Machine Learning for smart prosthetic control.

When I joined IIT-D, I had a brief stint in the neuro-mechanics research lab at the Centre for Biomedical Engineering (CBME). There, I worked extensively on gait analysis and prosthetic development for trans-femoral amputees. I built odd circuits that controlled the actuators of artificial knees, collected amputee gait data, learnt the neuro-mechanical aspects of human gait to a considerable depth and, among other things, interacted with several amputees to understand their daily experience of the world.

About 30 million amputees live in low-income countries [1,2]. Of those, only 5% to 15% can afford prostheses [3]. Although the state-of-the-art prostheses popular in developed countries offer several advanced features, their adoption in low-income, developing countries is hindered primarily by high cost, different functional demands and cultural issues. Losing a limb is a terrible thing. Beyond the stigma and the physical limitations, there are countless mental challenges that one has to constantly meet.

To tackle some of these issues in India, a group of scientists at CBME developed a cost-effective prosthesis [4], and efforts are on to make it “smart”. Motivated by the goal of better controlling the prosthetic limb, I joined this effort, where I helped formally analyse gait patterns and extract insight from them.

A new instrument was designed for this study, with different sensors measuring different things (deliberately vague sentence!). Among the many quantities measured was the foot-to-ground angle (FGA). This was accompanied by gait event markers indicating the toe-off and heel-strike events that occur while walking.

Fig 1. Notice the toe-off (at the beginning) and the heel-strike (at the end) in the gait cycle.

This is a graphical depiction of the FGA signal and the gait events I am talking about:

Fig 2. Segmentation of swing and stance phase of the FGA gait cycle.
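
The post does not spell out how the cycles were extracted, but a common preprocessing step (a sketch under that assumption, with heel-strike indices taken as given from the event markers) is to slice the FGA stream at successive heel strikes and resample every cycle to a fixed length so cycles can be stacked and compared:

```python
import numpy as np

def segment_cycles(fga, heel_strikes, n_points=100):
    """Slice an FGA time series into gait cycles between successive
    heel-strike sample indices, resampling each cycle to n_points."""
    cycles = []
    for start, end in zip(heel_strikes[:-1], heel_strikes[1:]):
        cycle = fga[start:end]
        # resample onto a common 0–100% gait-cycle axis
        x_old = np.linspace(0.0, 1.0, num=len(cycle))
        x_new = np.linspace(0.0, 1.0, num=n_points)
        cycles.append(np.interp(x_new, x_old, cycle))
    return np.stack(cycles)

# toy example: a synthetic periodic angle trace, heel strike every 120 samples
t = np.arange(360)
fga = np.sin(2 * np.pi * t / 120)
cycles = segment_cycles(fga, heel_strikes=[0, 120, 240, 360])
print(cycles.shape)  # -> (3, 100)
```

The fixed-length resampling is what lets cycles of different durations feed a model that expects a constant input size.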

Here is the question we posed: can we design an algorithm that identifies the type of terrain (one of Ramp Ascent (RA), Ramp Descent (RD) and Level Ground Walking (LW)) a person is walking on just by looking at the FGA signal? This would really help us actuate the knee motors better, because different types of terrain require different amounts of knee rotation (obviously, haha!).

This is how each cycle of the FGA signal looks for the different terrains:

Fig 3. Foot-to-ground angle corresponding to different locomotion terrains across multiple gait cycles for one representative participant. RD (solid, blue); LW (dashed, red); RA (dotted, green).

After hours of tedious data curation, I finally got around to doing some machine learning on the signals. I applied a basic 1-D convolutional neural network (because the receptive field spans a single dimension, time) and checked whether it could do a good job of classifying the locomotion terrains. Without elaborating much on the model itself or its mathematics, I have shown only the class activation maps for each sample, to better convey the findings.
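
The post does not give the network's architecture, so here is only an illustration of its core operation: a 1-D convolution whose kernel slides along the single time axis of the FGA signal (plain NumPy, cross-correlation convention as in most deep learning frameworks; the kernel values are made up, not learned weights from the actual model):

```python
import numpy as np

def conv1d(signal, kernel, stride=1):
    """Valid 1-D convolution (cross-correlation convention):
    the kernel slides along the single time dimension."""
    k = len(kernel)
    out_len = (len(signal) - k) // stride + 1
    out = np.empty(out_len)
    for i in range(out_len):
        window = signal[i * stride : i * stride + k]  # the receptive field
        out[i] = np.dot(window, kernel)
    return out

# a 3-tap difference-like kernel responds to rapid FGA changes
# (e.g. around toe-off) -- the kind of local feature a trained
# 1-D CNN can learn to detect on its own
fga = np.array([0.0, 0.0, 1.0, 3.0, 3.0, 3.0])
print(conv1d(fga, np.array([-1.0, 0.0, 1.0])))  # -> [1. 3. 2. 0.]
```

A trained network stacks many such filters (plus non-linearities and pooling) and learns the kernel values from the labelled cycles.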

In the above mosaic, the class activation maps for each locomotion mode are shown. The boxed segments of the signals are the discriminative samples that drive the model's decision while classifying a cycle.
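
The post does not detail how the maps were computed, but in the standard class activation map formulation (assuming a global-average-pooling head, which is the usual setup for CAMs) the map for a class is the class-weighted sum of the last convolutional layer's feature maps, leaving one relevance score per time step:

```python
import numpy as np

def class_activation_map(feature_maps, class_weights):
    """CAM for one class: channel-weighted sum of the final conv
    layer's feature maps, leaving a score per time step.
    feature_maps: (channels, time); class_weights: (channels,)."""
    cam = np.tensordot(class_weights, feature_maps, axes=1)  # (time,)
    # normalise to [0, 1] for visualisation
    cam -= cam.min()
    if cam.max() > 0:
        cam /= cam.max()
    return cam

# stand-in values: 8 feature channels over a 100-sample gait cycle
rng = np.random.default_rng(0)
fmaps = rng.random((8, 100))
w = rng.random(8)  # output-layer weights for one class
cam = class_activation_map(fmaps, w)
print(cam.shape)  # -> (100,)
```

Plotting `cam` over the FGA cycle is what highlights the discriminative segments shown in the boxes.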

Using standard support vector machine methods, the average classification accuracy is 79.57% ± 20.32% for able-bodied participants and 73.06% ± 12.70% for amputees, whereas the 1-D CNN formulation achieves 83.45% ± 14.50% and 80% ± 12.15%, respectively.
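
The post does not describe the SVM baseline's setup, so here is only a minimal sketch of how such a baseline might be evaluated on resampled cycles with scikit-learn; the data below are synthetic stand-ins (randomly generated, well-separated classes), not the study's measurements:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# synthetic stand-in data: 60 resampled FGA cycles (100 samples each),
# labelled 0=RD, 1=LW, 2=RA -- real data would come from the instrument
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(20, 100)) for c in range(3)])
y = np.repeat([0, 1, 2], 20)

# scale features, then fit an RBF-kernel SVM; score by 5-fold cross-validation
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(round(scores.mean(), 2))
```

Cross-validated accuracy (mean ± standard deviation across folds or participants) is the natural way to report the kind of numbers quoted above.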

Once the prosthetic device knows the type of terrain a person is walking on, that knowledge can be used in a feedback loop to actuate the motors in the knee (or even in the ankle) accordingly. This would considerably reduce the amputee's effort and allow them to walk comfortably for longer stretches at a time. However, this is still under development.
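
In the simplest form, closing the loop means mapping the classifier's terrain label to an actuation target that the motor controller then tracks. A toy sketch of that mapping follows; the flexion angles are made-up placeholders for illustration only, not values from the actual prosthesis controller:

```python
# illustrative placeholders only -- not real controller parameters
KNEE_FLEXION_TARGET_DEG = {
    "LW": 60.0,  # level-ground walking
    "RA": 70.0,  # ramp ascent needs more swing-phase flexion
    "RD": 55.0,  # ramp descent
}

def knee_setpoint(terrain: str) -> float:
    """Map the classifier's terrain label to a swing-phase knee
    flexion target, falling back to level walking if unsure."""
    return KNEE_FLEXION_TARGET_DEG.get(terrain, KNEE_FLEXION_TARGET_DEG["LW"])

print(knee_setpoint("RA"))  # -> 70.0
```

A real controller would of course smooth transitions between setpoints and guard against misclassifications, which is part of what remains under development.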

Read the whole paper at: https://www.sciencedirect.com/science/article/abs/pii/S026322412030988X

References:
[1]. WHO (World Health Organization). World report on disability 2011. Am. J. Phys. Med. Rehabil. 2011, 91, 549.
[2]. Andrysek, J. Lower-Limb Prosthetic Technologies in the Developing World: A Review of Literature from 1994–2010. Prosthet. Orthot. Int. 2010, 34, 378–398.
[3]. Pearlman, J.; Cooper, R.; Krizack, M.; Lindsley, A.; Wu, Y.; Reisinger, K.; Armstrong, W.; Casanova, H.; Chhabra, H.; Noon, J. Lower-limb prostheses and wheelchairs in low-income countries: An overview. IEEE Eng. Med. Biol. Mag. 2008, 27, 12–22.
[4]. Pandit, S.; Godiyal, A. K.; Vimal, A. K.; Singh, U.; Joshi, D.; Kalyanasundaram, D. An affordable insole-sensor-based trans-femoral prosthesis for normal gait. Sensors 2018, 18(3), 706.
