TY - GEN
T1 - Applying Hybrid Quantum LSTM for Indoor Localization Based on RSSI
AU - Chien, S. F.
AU - Chieng, David
AU - Chen, Samuel Y.C.
AU - Zarakovitis, Charilaos C.
AU - Lim, H. S.
AU - Xu, Y. H.
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
AB - A recent study demonstrated the efficacy of Long Short-Term Memory (LSTM) networks in significantly reducing the average Root Mean Square Error (RMSE) of indoor localization. Motivated by the promise of quantum algorithms, we explore Quantum LSTM (QLSTM) for indoor localization, leveraging a variational quantum circuit (VQC). QLSTM benefits from diverse gate sequences and additional variational parameters, which enhance its learning capability. As QLSTM is a relatively recent concept, a comprehensive investigation of its hyperparameters, including the learning rate, the number of hidden layers, and the number of quantum neurons, is essential to ascertain their influence on the RMSE achieved during training. The results show that QLSTM is highly sensitive to the choice of optimizer and can produce comparably low RMSE values with significantly fewer neurons than classical LSTM. For a two-hidden-layer architecture with 35 neurons per layer, 6 input features, and 2 outputs, the LSTM configuration has 15,892 parameters in total, whereas the QLSTM configuration is more compact at only 7,562 parameters. The RMSE of QLSTM remains comparable to that of its classical counterpart: 0.895 versus 0.8705.
KW - recurrent neural network
KW - indoor localization
KW - long short-term memory
KW - received signal strength indicator
KW - variational quantum circuit
UR - http://www.scopus.com/inward/record.url?scp=85195390581&partnerID=8YFLogxK
U2 - 10.1109/ICASSP48485.2024.10447032
DO - 10.1109/ICASSP48485.2024.10447032
M3 - Conference contribution
AN - SCOPUS:85195390581
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 131
EP - 135
BT - 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 49th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2024
Y2 - 14 April 2024 through 19 April 2024
ER -
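
The 15,892-parameter LSTM count quoted in the abstract is consistent with the standard four-gate LSTM parameterization. Below is a minimal Python sketch verifying it, assuming per-gate weight matrices with biases and a final dense layer mapping the hidden state to the 2 outputs; the output-head convention is an assumption, not stated in this record. (The QLSTM's 7,562 parameters depend on the paper's VQC design and cannot be reconstructed from the abstract alone.)

# Minimal sketch: checking the LSTM parameter count quoted in the abstract.
# Assumes the standard four-gate LSTM parameterization with biases and a
# dense output head; the head convention is an assumption, not from the paper.

def lstm_layer_params(input_size, hidden_size):
    # Four gates, each with a weight matrix over [input; hidden] plus a bias.
    return 4 * ((input_size + hidden_size) * hidden_size + hidden_size)

def stacked_lstm_params(input_size, hidden_size, output_size, num_layers):
    total = lstm_layer_params(input_size, hidden_size)            # first layer
    total += (num_layers - 1) * lstm_layer_params(hidden_size, hidden_size)
    total += hidden_size * output_size + output_size              # dense output head
    return total

print(stacked_lstm_params(6, 35, 2, 2))  # 15892, matching the abstract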