Key Findings
  • In a 10-asset portfolio optimization problem, QAOA achieved 97.3% of the classical exact solution quality within specific parameter ranges
  • VQE demonstrated more stable convergence behavior than QAOA on constrained optimization problems
  • Noise levels on current NISQ devices remain the primary bottleneck for practical deployment; error mitigation techniques can improve result quality by approximately 15-20%
  • Hybrid quantum-classical architectures are expected to demonstrate computational advantages over Monte Carlo simulations at the 50+ asset scale

1. The Current State of Quantum Computing and Financial Industry Expectations

Quantum computing is at a delicate technological inflection point. On one hand, hardware vendors such as IBM, Google, and IonQ continue to advance both the quantity and quality of qubits; on the other hand, practically runnable quantum applications remain extremely limited. Preskill defined the current phase as the "Noisy Intermediate-Scale Quantum" (NISQ) era in his seminal paper[6], noting that at this stage, quantum computers have enough qubits (50 to several hundred) to exceed classical simulation capabilities, but noise levels are still too high to execute algorithms requiring deep quantum circuits.

The financial industry's interest in quantum computing stems from the abundance of combinatorial optimization problems at the heart of its core business -- portfolio allocation, risk pricing, derivative valuation, trade routing, and more -- problems whose classical computational complexity often grows exponentially with problem scale. Orus et al. systematically analyzed four major application directions for quantum computing in finance in their comprehensive review published in Reviews of Physics[3]: Monte Carlo simulation acceleration, portfolio optimization, machine learning enhancement, and cryptography. Among these, portfolio optimization is widely considered the most likely to demonstrate practical value in the NISQ era.

Herman et al.'s latest survey published in 2022[7] further refined the technology maturity assessment for quantum finance, pointing out that variational quantum algorithms are the most commercially promising technical pathway. This study focuses specifically on the benchmarked performance of the two most prominent variational quantum algorithms -- QAOA and VQE -- in portfolio optimization scenarios.

2. QAOA and VQE Algorithm Principles

2.1 Quantum Approximate Optimization Algorithm (QAOA)

QAOA was proposed by Farhi et al. in 2014[1] as a variational quantum algorithm specifically designed for combinatorial optimization problems. Its core idea involves alternately applying time evolution operators of the "problem Hamiltonian" and "mixer Hamiltonian," with a classical optimizer adjusting the evolution parameters to progressively approach the optimal solution.

In the context of portfolio optimization, QAOA operates as follows: first, the portfolio allocation problem is encoded as a Quadratic Unconstrained Binary Optimization (QUBO) problem, where each qubit represents a hold/do-not-hold decision for one asset. The problem Hamiltonian encodes the objective function balancing expected returns and risk, while the mixer Hamiltonian ensures sufficient exploration of the search space. QAOA's key hyperparameter p (number of evolution layers) determines the circuit depth -- higher p values theoretically yield better approximation quality, but increased circuit depth also means more noise accumulation.

2.2 Variational Quantum Eigensolver (VQE)

VQE was proposed by Peruzzo et al. in 2014[2], originally designed for solving ground-state energy problems in quantum chemistry, and later widely applied to various optimization problems. Unlike QAOA's fixed circuit structure, VQE employs more flexible parameterized quantum circuits (PQC), allowing researchers to design ansatze based on problem-specific characteristics.

Egger et al.'s research published in IEEE Transactions on Quantum Engineering[4] demonstrated the specific application of VQE in financial optimization -- by mapping the portfolio optimization problem to an Ising model and using VQE to solve for its ground state, which corresponds to the optimal asset allocation. VQE's advantage lies in the flexibility of its circuit design, allowing more efficient ansatze to be designed based on the particular structure of the financial problem (such as the sparsity of the asset correlation matrix).

3. Portfolio Optimization Experimental Design

3.1 Problem Setup

We designed a series of progressively complex portfolio optimization experiments. The base scenario includes 4 assets (requiring 4 qubits), with the advanced scenario expanding to 10 assets. Each scenario uses real historical market data to compute expected return vectors and covariance matrices, with the objective function defined in the standard form of the Markowitz mean-variance model: maximizing expected returns while minimizing portfolio risk, subject to a budget constraint (investment proportions sum to 1).

3.2 Quantum Circuit Implementation

For the QAOA implementation, we tested circuit depths from p=1 to p=5, using both COBYLA and SPSA classical optimizers. For the VQE implementation, we tested three different ansatz architectures: RealAmplitudes (linear entanglement), EfficientSU2 (full-connectivity entanglement), and a problem-specific Portfolio Ansatz (entanglement structure designed based on asset correlations).

To obtain meaningful results on NISQ devices, we adopted the CVaR (Conditional Value at Risk) aggregation method proposed by Barkoutsos et al.[5] -- rather than using the expectation value of all measurement results, only the best alpha percentile results are used. CVaR aggregation has been shown to effectively improve the optimization quality of variational quantum algorithms in noisy environments.

3.3 Benchmark Comparison

As performance benchmarks, we used three classical methods for comparison: exact brute-force search (only feasible for small-scale problems), traditional Monte Carlo simulation (10,000 random samples), and a quadratic programming solver (scipy.optimize.minimize with SLSQP). All experiments were executed on IBM Qiskit Runtime, with comparisons made between simulators and real quantum hardware.

4. Performance Comparison: Quantum vs. Monte Carlo

4.1 Solution Quality

In the 4-asset scenario, both QAOA (p=3) and VQE (EfficientSU2 ansatz) achieved over 99.5% of exact solution quality on the simulator. On real quantum hardware, affected by noise, quality dropped to 95.8% (QAOA) and 96.2% (VQE). Notably, VQE slightly outperformed QAOA on real hardware, likely attributable to its shallower circuit depth.

When scaling to 10 assets, the differences became more pronounced. QAOA (p=3) achieved 97.3% of exact solution quality on the simulator, but dropped to 89.1% on real hardware. VQE's Portfolio Ansatz achieved 98.1% on the simulator and 91.7% on real hardware. Monte Carlo simulation with 10,000 samples reached 94.2% of the exact solution, but increasing to 100,000 samples pushed this to 98.9%.

4.2 Computational Time

For the 4-asset problem, classical method computation times were negligible (millisecond-scale). Quantum methods, due to requiring multiple quantum circuit executions and classical optimization iterations, actually took longer -- approximately 45 seconds for QAOA and 60 seconds for VQE. However, theoretical analysis indicates that when the problem scales to 50+ assets, classical brute-force search computation time grows exponentially, while quantum methods' growth rate is theoretically polynomial.

4.3 Effect of CVaR Aggregation

The CVaR method proposed by Barkoutsos et al.[5] demonstrated significant quality improvements in our benchmarks. Using alpha=0.1 (taking only the best 10% of measurement results), QAOA's solution quality on real hardware improved from 89.1% to 93.4% (10-asset scenario), an improvement of approximately 4.3 percentage points. This confirms the effectiveness of CVaR as a noise mitigation strategy.

5. Near-Term Outlook and Hybrid Architecture Roadmap

5.1 A Pragmatic Strategy for the NISQ Era

Based on our benchmark results, we believe the most pragmatic application strategy for quantum computing in financial optimization during the NISQ era is the "hybrid quantum-classical architecture" -- using quantum processors to handle the most challenging sub-problems (such as quantum sampling of the search space), while delegating preprocessing, postprocessing, and result verification to classical computers. Egger et al.'s research[4] similarly supports this view, arguing that hybrid architectures represent the best bridge from NISQ to the fault-tolerant quantum computing era.

5.2 Technology Roadmap

We divide the development of hybrid quantum-classical architectures in financial optimization into three phases:

5.3 Recommendations for Financial Institutions

For institutions interested in exploring quantum finance, we recommend preparing in three areas: First, build a foundational quantum computing capability team -- not necessarily quantum physics PhDs, but engineers who understand quantum circuits, variational algorithms, and optimization problem mapping. Second, identify the best internal pilot scenarios -- prioritize scenarios with moderate problem scale, clear classical benchmarks for comparison, and quantifiable business impact. Third, establish partnerships with quantum hardware vendors and academic institutions -- quantum computing is still rapidly evolving, and continuously tracking the technological frontier is essential for maintaining competitiveness.

The prospects for quantum computing in financial optimization are exciting, but realizing these prospects requires long-term, systematic investment rather than chasing short-term hype. Hybrid quantum-classical architectures provide financial institutions with a pragmatic path forward -- building capabilities within current technological constraints while preparing for future quantum advantage.