
Tools for quantum machine learning

Qadence offers a wide range of utilities to help build and research quantum machine learning algorithms, including:

  • a set of constructors for circuits commonly used in quantum machine learning
  • a set of tools for optimizing quantum neural networks and loading classical data into a QML algorithm

Quantum machine learning constructors

Besides the arbitrary Hamiltonian constructors, Qadence also provides a complete set of program constructors useful for digital-analog quantum machine learning programs.

Feature maps

A few feature maps are directly available for loading classical data into quantum circuits by encoding them into gate rotation angles.

from qadence import feature_map

n_qubits = 3

fm = feature_map(n_qubits, fm_type="fourier")
print(f"Fourier = {fm}")

fm = feature_map(n_qubits, fm_type="chebyshev")
print(f"Chebyshev = {fm}")

fm = feature_map(n_qubits, fm_type="tower")
print(f"Tower = {fm}")
Fourier = KronBlock(0,1,2) [tag: FM]
├── RX(0) [params: ['phi']]
├── RX(1) [params: ['phi']]
└── RX(2) [params: ['phi']]
Chebyshev = KronBlock(0,1,2) [tag: FM]
├── RX(0) [params: ['2*acos(phi)']]
├── RX(1) [params: ['2*acos(phi)']]
└── RX(2) [params: ['2*acos(phi)']]
Tower = KronBlock(0,1,2) [tag: FM]
├── RX(0) [params: ['2*acos(phi)']]
├── RX(1) [params: ['4*acos(phi)']]
└── RX(2) [params: ['6*acos(phi)']]
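
As a quick illustration of how the encoding enters a computation, here is a minimal sketch (not part of the original listing) that wraps the Chebyshev feature map in a QuantumCircuit and executes it for a sample value of the feature parameter "phi", using qadence's standard run interface:

import torch
from qadence import QuantumCircuit, run, feature_map

n_qubits = 3

# Circuit containing only the Chebyshev feature map
fm = feature_map(n_qubits, fm_type="chebyshev")
circuit = QuantumCircuit(n_qubits, fm)

# The feature parameter is named "phi" by default; its value is supplied at run time
wf = run(circuit, values={"phi": torch.tensor([0.5])})
print(wf.shape)  # a batch of statevectors, here of shape (1, 2**n_qubits)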

Hardware-efficient ansatz

Ansatz blocks for quantum machine learning are typically built following the hardware-efficient ansatz (HEA) formalism. Both fully digital and digital-analog HEAs can easily be built with the hea function. By default, the digital version is returned:

from qadence import hea
from qadence.draw import display

n_qubits = 3
depth = 2

ansatz = hea(n_qubits, depth)
[Circuit diagram: digital HEA on 3 qubits with depth 2 — layers of RX, RY and RX rotations (theta₀ … theta₁₇) interleaved with CNOT entanglers.]

As seen above, the rotation layers are automatically parameterized, and the prefix "theta" can be changed with the param_prefix argument.

Furthermore, both the single-qubit rotations and the two-qubit entangler can be customized with the operations and entangler arguments. The operations can be passed as a list of single-qubit rotations, while the entangler should be either CNOT, CZ, CRX, CRY, CRZ or CPHASE.

from qadence import RX, RY, CPHASE

ansatz = hea(
    n_qubits=n_qubits,
    depth=depth,
    param_prefix="phi",
    operations=[RX, RY, RX],
    entangler=CPHASE
)
[Circuit diagram: HEA on 3 qubits with depth 2 — RX, RY and RX rotation layers with prefix "phi" (phi₀ … phi₁₇) interleaved with CPHASE entanglers (phi_ent₀ … phi_ent₃).]

A truly hardware-efficient ansatz means that the entangling operation can be chosen according to each device's native interactions. Besides digital operations, in Qadence it is also possible to build digital-analog HEAs where the entanglement is produced by the natural evolution of a set of interacting qubits, as natively implemented in neutral atom devices. As with other digital-analog functions, this is controlled with the strategy argument, which accepts values from the Strategy enum type. Currently, only Strategy.DIGITAL and Strategy.SDAQC are available. By default, passing strategy = Strategy.SDAQC uses a global entangling Hamiltonian with Ising-like NN interactions and constant interaction strength:

from qadence import Strategy

ansatz = hea(
    n_qubits,
    depth=depth,
    strategy=Strategy.SDAQC
)
[Circuit diagram: digital-analog HEA on 3 qubits with depth 2 — RX, RY and RX rotation layers interleaved with HamEvo entangling blocks with evolution times theta_t₀ and theta_t₁.]

Note that, by default, only the time-parameter is automatically parameterized when building a digital-analog HEA. However, as described in the Hamiltonians tutorial, arbitrary interaction Hamiltonians can easily be built with the hamiltonian_factory function, with either customized or fully parameterized interactions, and these can be passed directly as the entangler for a customizable digital-analog HEA.

from qadence import RX, RY, Strategy, hamiltonian_factory, Interaction, N, Register, hea

# Build a parameterized neutral-atom Hamiltonian following a honeycomb_lattice:
register = Register.honeycomb_lattice(1, 1)

entangler = hamiltonian_factory(
    register,
    interaction=Interaction.NN,
    detuning=N,
    interaction_strength="e",
    detuning_strength="n"
)

# Build a fully parameterized Digital-Analog HEA:
n_qubits = register.n_qubits
depth = 2

ansatz = hea(
    n_qubits=register.n_qubits,
    depth=depth,
    operations=[RX, RY, RX],
    entangler=entangler,
    strategy=Strategy.SDAQC
)
[Circuit diagram: digital-analog HEA on the 6-qubit honeycomb register with depth 2 — RX, RY and RX rotation layers (theta₀ … theta₃₅) interleaved with parameterized HamEvo entangling blocks.]

Machine Learning Tools

Dataloaders

When using qadence, you can supply classical data to a quantum machine learning algorithm by using a standard PyTorch DataLoader instance. Qadence also provides the DictDataLoader convenience class, which allows building dictionaries of DataLoader instances and iterating over them easily.

import torch
from torch.utils.data import DataLoader, TensorDataset
from qadence.ml_tools import DictDataLoader

def dataloader() -> DataLoader:
    batch_size = 5
    x = torch.linspace(0, 1, batch_size).reshape(-1, 1)
    y = torch.sin(x)

    dataset = TensorDataset(x, y)
    return DataLoader(dataset, batch_size=batch_size)


def dictdataloader() -> DictDataLoader:
    batch_size = 5

    keys = ["y1", "y2"]
    dls = {}
    for k in keys:
        x = torch.rand(batch_size, 1)
        y = torch.sin(x)
        dataset = TensorDataset(x, y)
        dataloader = DataLoader(dataset, batch_size=batch_size)
        dls[k] = dataloader

    return DictDataLoader(dls)

n_epochs = 2

# iterate standard DataLoader
dl = dataloader()
for i in range(n_epochs):
    data = next(iter(dl))

# iterate DictDataLoader
ddl = dictdataloader()
for i in range(n_epochs):
    data = next(iter(ddl))
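
Each element yielded by the DictDataLoader is a dictionary keyed like the input dataloaders. As a hedged sketch (assuming each value of that dictionary is an (x, y) batch pair), the batches can be unpacked as follows:

# Assumption: a DictDataLoader batch is a dict keyed by "y1"/"y2",
# with each value holding an (x, y) pair of tensors.
batch = next(iter(ddl))
for key, (x_batch, y_batch) in batch.items():
    print(key, x_batch.shape, y_batch.shape)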

Optimization routines

For training QML models, qadence also offers a few out-of-the-box routines for optimizing differentiable models such as QNNs and QuantumModels, containing trainable and/or non-trainable parameters (refer to the parameters documentation for a refresher on the different parameter types).

These routines perform training, log/print loss metrics and store intermediate checkpoints of models. In the following, we use train_with_grad as an example, but the code can be used directly with the gradient-free routine as well.

Like every other training routine commonly used in machine learning, it requires a model, data and an optimizer as input arguments. In addition, it requires a loss_fn and a TrainConfig. The loss_fn must be a function that takes both a model and data and returns a tuple (loss, metrics), where metrics is a dictionary of scalars that can also be customized.

import torch
from itertools import count
cnt = count()
criterion = torch.nn.MSELoss()

def loss_fn(model: torch.nn.Module, data: torch.Tensor) -> tuple[torch.Tensor, dict]:
    next(cnt)
    x, y = data[0], data[1]
    out = model(x)
    loss = criterion(out, y)
    return loss, {}

The TrainConfig tells train_with_grad which batch_size to use, how many epochs to train for, at which intervals to print/log metrics and how often to store intermediate checkpoints.

from qadence.ml_tools import TrainConfig

batch_size = 5
n_epochs = 100

config = TrainConfig(
    folder="some_path/",
    max_iter=n_epochs,
    checkpoint_every=100,
    write_every=100,
    batch_size=batch_size,
)

Let's see it in action with a simple example.

Fitting a function with a QNN using ml_tools

Let's look at a complete example of how to use train_with_grad now.

from pathlib import Path
import torch
from itertools import count
from qadence.constructors import hamiltonian_factory, hea, feature_map
from qadence import chain, Parameter, QuantumCircuit, Z
from qadence.models import QNN
from qadence.ml_tools import train_with_grad, TrainConfig
import matplotlib.pyplot as plt

n_qubits = 2
fm = feature_map(n_qubits)
ansatz = hea(n_qubits=n_qubits, depth=3)
observable = hamiltonian_factory(n_qubits, detuning=Z)
circuit = QuantumCircuit(n_qubits, fm, ansatz)

model = QNN(circuit, observable, backend="pyqtorch", diff_mode="ad")
batch_size = 1
input_values = {"phi": torch.rand(batch_size, requires_grad=True)}
pred = model(input_values)

cnt = count()
criterion = torch.nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)

def loss_fn(model: torch.nn.Module, data: torch.Tensor) -> tuple[torch.Tensor, dict]:
    next(cnt)
    x, y = data[0], data[1]
    out = model(x)
    loss = criterion(out, y)
    return loss, {}

tmp_path = Path("/tmp")

n_epochs = 5

config = TrainConfig(
    folder=tmp_path,
    max_iter=n_epochs,
    checkpoint_every=100,
    write_every=100,
    batch_size=batch_size,
)

batch_size = 25

x = torch.linspace(0, 1, batch_size).reshape(-1, 1)
y = torch.sin(x)

train_with_grad(model, (x, y), optimizer, config, loss_fn=loss_fn)

plt.plot(x.numpy(), y.numpy())
plt.plot(x.numpy(), model(x).detach().numpy())

For users who want to use the low-level API of qadence, here is the example from above written without train_with_grad.

Fitting a function - Low-level API

from pathlib import Path
import torch
from itertools import count
from qadence.constructors import hamiltonian_factory, hea, feature_map
from qadence import chain, Parameter, QuantumCircuit, Z
from qadence.models import QNN
from qadence.ml_tools import train_with_grad, TrainConfig

n_qubits = 2
fm = feature_map(n_qubits)
ansatz = hea(n_qubits=n_qubits, depth=3)
observable = hamiltonian_factory(n_qubits, detuning=Z)
circuit = QuantumCircuit(n_qubits, fm, ansatz)

model = QNN(circuit, observable, backend="pyqtorch", diff_mode="ad")
batch_size = 1
input_values = {"phi": torch.rand(batch_size, requires_grad=True)}
pred = model(input_values)

criterion = torch.nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
n_epochs=50
cnt = count()

tmp_path = Path("/tmp")

config = TrainConfig(
    folder=tmp_path,
    max_iter=n_epochs,
    checkpoint_every=100,
    write_every=100,
    batch_size=batch_size,
)

batch_size = 25

x = torch.linspace(0, 1, batch_size).reshape(-1, 1)
y = torch.sin(x)

for i in range(n_epochs):
    optimizer.zero_grad()
    out = model(x)
    loss = criterion(out, y)
    loss.backward()
    optimizer.step()
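
The fitted model can then be checked against the target function, mirroring the plotting step of the ml_tools example above:

import matplotlib.pyplot as plt

plt.plot(x.numpy(), y.numpy())
plt.plot(x.numpy(), model(x).detach().numpy())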