Tools for quantum machine learning
Qadence offers a wide range of utilities to help build and research
quantum machine learning algorithms, including:
a set of constructors for circuits commonly used in quantum machine learning
a set of tools for optimizing quantum neural networks and loading classical data into a QML algorithm
Quantum machine learning constructors
Besides the arbitrary Hamiltonian constructors, Qadence also provides a complete set of
program constructors useful for digital-analog quantum machine learning.
Feature maps
A few feature maps are directly available for loading classical data into quantum circuits by encoding them
into gate rotation angles.
from qadence import feature_map

n_qubits = 3

fm = feature_map(n_qubits, fm_type="fourier")
print(f"Fourier = {fm}")

fm = feature_map(n_qubits, fm_type="chebyshev")
print(f"Chebyshev = {fm}")

fm = feature_map(n_qubits, fm_type="tower")
print(f"Tower = {fm}")
Fourier = KronBlock(0,1,2) [tag: Constant Fourier FM]
├── RX(0) [params: ['phi']]
├── RX(1) [params: ['phi']]
└── RX(2) [params: ['phi']]
Chebyshev = KronBlock(0,1,2) [tag: Constant Chebyshev FM]
├── RX(0) [params: ['acos(phi)']]
├── RX(1) [params: ['acos(phi)']]
└── RX(2) [params: ['acos(phi)']]
Tower = KronBlock(0,1,2) [tag: Tower Chebyshev FM]
├── RX(0) [params: ['1_0*acos(phi)']]
├── RX(1) [params: ['2_0*acos(phi)']]
└── RX(2) [params: ['3_0*acos(phi)']]
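To see a feature map in action, here is a minimal sketch (not part of the original example, and assuming the top-level expectation convenience function with circuit, observable and values arguments) that composes the Chebyshev feature map into a circuit and evaluates an expectation value for a concrete value of the feature parameter "phi":
import torch
from qadence import QuantumCircuit, Z, expectation, feature_map

n_qubits = 3
fm = feature_map(n_qubits, fm_type="chebyshev")
circuit = QuantumCircuit(n_qubits, fm)

# The feature parameter is named "phi" by default, so a classical input is
# supplied through a values dictionary keyed by that name.
values = {"phi": torch.tensor([0.5])}
print(expectation(circuit, Z(0), values=values))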
Hardware-efficient ansatz
Ansatz blocks for quantum machine learning are typically built following the hardware-efficient ansatz (HEA) formalism.
Both fully digital and digital-analog HEAs can easily be built with the hea function.
By default, the digital version is returned:
from qadence import hea
from qadence.draw import display

n_qubits = 3
depth = 2

ansatz = hea(n_qubits, depth)
[Circuit diagram: a depth-2 digital HEA on 3 qubits, with parameterized RX/RY/RX rotation layers (theta₀ ... theta₁₇) followed by CNOT entanglers between neighbouring qubits.]
As seen above, the rotation layers are automatically parameterized, and the prefix "theta" can be changed with the param_prefix argument.
Furthermore, both the single-qubit rotations and the two-qubit entangler can be customized with the operations and entangler arguments. The operations can be passed as a list of single-qubit rotations, while the entangler should be either CNOT, CZ, CRX, CRY, CRZ or CPHASE.
from qadence import RX, RY, CPHASE

ansatz = hea(
    n_qubits=n_qubits,
    depth=depth,
    param_prefix="phi",
    operations=[RX, RY, RX],
    entangler=CPHASE
)
[Circuit diagram: the customized HEA on 3 qubits, with RX/RY/RX rotation layers using the "phi" prefix (phi₀ ... phi₁₇) and CPHASE entanglers (phi_ent₀ ... phi_ent₃).]
Having a truly hardware-efficient ansatz means that the entangling operation can be chosen according to each device's native interactions. Besides digital operations, in Qadence it is also possible to build digital-analog HEAs where the entanglement is produced by the natural evolution of a set of interacting qubits, as natively implemented in neutral-atom devices. As with other digital-analog functions, this is controlled with the strategy argument, which takes values from the Strategy enum type. Currently, only Strategy.DIGITAL and Strategy.SDAQC are available. By default, calling strategy = Strategy.SDAQC uses a global entangling Hamiltonian with Ising-like NN interactions and constant interaction strength:
from qadence import Strategy

ansatz = hea(
    n_qubits,
    depth=depth,
    strategy=Strategy.SDAQC
)
[Circuit diagram: a depth-2 digital-analog HEA on 3 qubits, with RX/RY/RX rotation layers (theta₀ ... theta₁₇) interleaved with global HamEvo entangling blocks parameterized by evolution times theta_t₀ and theta_t₁.]
Note that, by default, only the evolution time parameter is automatically parameterized when building a digital-analog HEA. However, as described in the Hamiltonians tutorial, arbitrary interaction Hamiltonians can be easily built with the hamiltonian_factory function, with either customized or fully parameterized interactions, and these can be passed directly as the entangler of a customizable digital-analog HEA.
from qadence import hamiltonian_factory, Interaction, N, Register, hea

# Build a parameterized neutral-atom Hamiltonian following a honeycomb_lattice:
register = Register.honeycomb_lattice(1, 1)
entangler = hamiltonian_factory(
    register,
    interaction=Interaction.NN,
    detuning=N,
    interaction_strength="e",
    detuning_strength="n"
)

# Build a fully parameterized Digital-Analog HEA:
n_qubits = register.n_qubits
depth = 2

ansatz = hea(
    n_qubits=register.n_qubits,
    depth=depth,
    operations=[RX, RY, RX],
    entangler=entangler,
    strategy=Strategy.SDAQC
)
[Circuit diagram: a depth-2 digital-analog HEA on the 6-qubit honeycomb register, with RX/RY/RX rotation layers (theta₀ ... theta₃₅) interleaved with HamEvo blocks generated by the parameterized neutral-atom Hamiltonian (evolution times theta_t₀ and theta_t₁).]
Dataloaders
When using qadence, you can supply classical data to a quantum machine learning
algorithm by using a standard PyTorch DataLoader instance. Qadence also provides
the DictDataLoader convenience class, which allows building dictionaries of
DataLoader instances and easily iterating over them.
import torch
from torch.utils.data import DataLoader, TensorDataset
from qadence.ml_tools import DictDataLoader


def dataloader() -> DataLoader:
    batch_size = 5
    x = torch.linspace(0, 1, batch_size).reshape(-1, 1)
    y = torch.sin(x)
    dataset = TensorDataset(x, y)
    return DataLoader(dataset, batch_size=batch_size)


def dictdataloader() -> DictDataLoader:
    batch_size = 5
    keys = ["y1", "y2"]
    dls = {}
    for k in keys:
        x = torch.rand(batch_size, 1)
        y = torch.sin(x)
        dataset = TensorDataset(x, y)
        dataloader = DataLoader(dataset, batch_size=batch_size)
        dls[k] = dataloader
    return DictDataLoader(dls)


n_epochs = 2

# Iterate over a standard DataLoader
dl = dataloader()
for i in range(n_epochs):
    data = next(iter(dl))

# Iterate over a DictDataLoader
ddl = dictdataloader()
for i in range(n_epochs):
    data = next(iter(ddl))
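As a quick check (a small illustrative addition, assuming that iterating a DictDataLoader yields a dictionary of batches keyed like the dataloaders used to build it), the structure of one batch can be inspected as follows:
# Continues the snippet above: inspect one batch from the DictDataLoader.
batch = next(iter(dictdataloader()))
for key, (x_batch, y_batch) in batch.items():
    # Each entry holds the (x, y) batch of the corresponding DataLoader ("y1", "y2").
    print(key, x_batch.shape, y_batch.shape)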
Optimization routines
For training QML models, qadence also offers a few out-of-the-box routines for optimizing differentiable
models, such as QNNs and QuantumModels, containing trainable and/or non-trainable parameters
(you can refer to this for a refresher on the different parameter types).
These routines perform training, log/print loss metrics and store intermediate model checkpoints. In the following, we
use train_with_grad as an example, but the code can be used directly with the gradient-free routine.
Like every other training routine commonly used in machine learning, it requires a model, data and an optimizer as input arguments.
In addition, it requires a loss_fn and a TrainConfig.
A loss_fn is required to be a function which expects both a model and data and returns a tuple of (loss, metrics: <dict>), where metrics is a dict of scalars which can also be customized.
import torch
from itertools import count

cnt = count()
criterion = torch.nn.MSELoss()


def loss_fn(model: torch.nn.Module, data: torch.Tensor) -> tuple[torch.Tensor, dict]:
    next(cnt)
    x, y = data[0], data[1]
    out = model(x)
    loss = criterion(out, y)
    return loss, {}
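For illustration (a small addition, not part of the original example), this loss function works with any torch.nn.Module and an (x, y) batch:
# Quick check of loss_fn with a plain linear model and a random batch.
dummy_model = torch.nn.Linear(1, 1)
x = torch.rand(5, 1)
batch = (x, torch.sin(x))

loss, metrics = loss_fn(dummy_model, batch)
print(loss.item(), metrics)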
The TrainConfig tells train_with_grad which batch_size to use,
how many epochs to train for, at which intervals to print/log metrics and how often to store intermediate checkpoints.
from qadence.ml_tools import TrainConfig

batch_size = 5
n_epochs = 100

config = TrainConfig(
    folder="some_path/",
    max_iter=n_epochs,
    checkpoint_every=100,
    write_every=100,
    batch_size=batch_size,
)
Let's see this in action with a complete example of how to use train_with_grad.
from pathlib import Path
import torch
from itertools import count
from qadence.constructors import hamiltonian_factory, hea, feature_map
from qadence import chain, Parameter, QuantumCircuit, Z
from qadence.models import QNN
from qadence.ml_tools import train_with_grad, TrainConfig
import matplotlib.pyplot as plt

n_qubits = 2
fm = feature_map(n_qubits)
ansatz = hea(n_qubits=n_qubits, depth=3)
observable = hamiltonian_factory(n_qubits, detuning=Z)
circuit = QuantumCircuit(n_qubits, fm, ansatz)

model = QNN(circuit, observable, backend="pyqtorch", diff_mode="ad")
batch_size = 1
input_values = {"phi": torch.rand(batch_size, requires_grad=True)}
pred = model(input_values)

cnt = count()
criterion = torch.nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)


def loss_fn(model: torch.nn.Module, data: torch.Tensor) -> tuple[torch.Tensor, dict]:
    next(cnt)
    x, y = data[0], data[1]
    out = model(x)
    loss = criterion(out, y)
    return loss, {}


tmp_path = Path("/tmp")

n_epochs = 5

config = TrainConfig(
    folder=tmp_path,
    max_iter=n_epochs,
    checkpoint_every=100,
    write_every=100,
    batch_size=batch_size,
)

batch_size = 25
x = torch.linspace(0, 1, batch_size).reshape(-1, 1)
y = torch.sin(x)

train_with_grad(model, (x, y), optimizer, config, loss_fn=loss_fn)

# Compare the target function with the trained model's prediction
plt.plot(x.numpy(), y.numpy())
plt.plot(x.numpy(), model(x).detach().numpy())
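The same model, data, config and loss function can, in principle, also be fed to the gradient-free routine mentioned above. The sketch below is only an illustration of that idea: it assumes a train_gradient_free function in qadence.ml_tools sharing the (model, data, optimizer, config, loss_fn) signature, driven by a nevergrad optimizer sized by a num_parameters helper; check the qadence.ml_tools API before relying on these names.
# Hypothetical sketch of gradient-free training (function and helper names are
# assumptions, not verified against this tutorial).
import nevergrad as ng
from qadence.ml_tools import train_gradient_free, num_parameters

ng_optimizer = ng.optimizers.NGOpt(
    budget=n_epochs, parametrization=num_parameters(model)
)
train_gradient_free(model, (x, y), ng_optimizer, config, loss_fn=loss_fn)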
For users who want to use the low-level API of qadence, here is the example from above written without train_with_grad.
Fitting a function - Low-level API
from pathlib import Path
import torch
from itertools import count
from qadence.constructors import hamiltonian_factory, hea, feature_map
from qadence import chain, Parameter, QuantumCircuit, Z
from qadence.models import QNN
from qadence.ml_tools import TrainConfig

n_qubits = 2
fm = feature_map(n_qubits)
ansatz = hea(n_qubits=n_qubits, depth=3)
observable = hamiltonian_factory(n_qubits, detuning=Z)
circuit = QuantumCircuit(n_qubits, fm, ansatz)

model = QNN(circuit, observable, backend="pyqtorch", diff_mode="ad")
batch_size = 1
input_values = {"phi": torch.rand(batch_size, requires_grad=True)}
pred = model(input_values)

criterion = torch.nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
n_epochs = 50
cnt = count()

tmp_path = Path("/tmp")

# Kept for parity with the example above; the manual loop below does not use it.
config = TrainConfig(
    folder=tmp_path,
    max_iter=n_epochs,
    checkpoint_every=100,
    write_every=100,
    batch_size=batch_size,
)

batch_size = 25
x = torch.linspace(0, 1, batch_size).reshape(-1, 1)
y = torch.sin(x)

# Manual training loop replacing train_with_grad
for i in range(n_epochs):
    optimizer.zero_grad()
    out = model(x)
    loss = criterion(out, y)
    loss.backward()
    optimizer.step()
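As in the train_with_grad example, the result of the manual loop can be visualized by plotting the target against the model prediction (a small illustrative addition, not part of the original snippet):
import matplotlib.pyplot as plt

# Compare the target function with the prediction of the manually trained model.
plt.plot(x.numpy(), y.numpy(), label="sin(x)")
plt.plot(x.numpy(), model(x).detach().numpy(), label="QNN prediction")
plt.legend()
plt.show()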