
Android, Rx and Kotlin, or how to make a LEGO claw clench. Part 1

Hello, Habr! As luck would have it, in August 2018 I got to start working with my comrade (kirillskiy) on an amazingly interesting project. By day we were ordinary programmers, and by night we were programmers fighting the problem of motion recognition for people with functional limitations of their limbs. Healthy people could, of course, use the same technology in all sorts of ways too.

In this article Kirill describes the project in general terms; I will go into more detail and cover its Android side.
First, about the project as a whole, what we came up with and how we wanted to implement it:

1) EMG (electromyography, recording the electrical activity of muscles) was chosen as the way to obtain the data (and oh yes, there will be a lot of data). The method was first applied in 1907, so we were walking a well-beaten path.

2) We found an 8-channel EMG sensor working over Bluetooth. It even had its own API, which in the end turned out to be absolutely useless, because we had to connect to it directly as a BT device. Thanks at least for writing the specification.

3) We decided that everything would work like this:


4) The Android part. I am an Android developer, so it would have been a sin not to use one. The Android device does this:


5) Raspberry Pi 3B. We put Android Things on the Raspberry and spun up a BT server on it, which receives messages from the Android device and moves the corresponding motors driving the LEGO super-claw.
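The motor side on the Raspberry can be sketched roughly like this: a minimal sketch, assuming a single motor hanging off one GPIO pin. The pin name and the class itself are mine for illustration; the real project drives several LEGO motors.

import com.google.android.things.pio.Gpio
import com.google.android.things.pio.PeripheralManager

// Hypothetical single-motor driver; "BCM17" is an assumed pin, not the project's wiring.
class ClawMotor(pinName: String = "BCM17") {

    private val gpio: Gpio = PeripheralManager.getInstance()
            .openGpio(pinName)
            .apply { setDirection(Gpio.DIRECTION_OUT_INITIALLY_LOW) }

    // Spin the motor while a gesture is held, stop it otherwise.
    fun set(running: Boolean) {
        gpio.value = running
    }

    fun close() = gpio.close()
}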

6) Server. Deployed locally in Docker. It accepts the data sent by your device, trains the neural network, and returns the model.
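Since Retrofit2 is in the stack, the client side of that exchange could look roughly like the sketch below. The routes and DTO names here are assumptions for illustration; the real API lives in the project sources.

import io.reactivex.Single
import okhttp3.RequestBody
import retrofit2.http.Body
import retrofit2.http.GET
import retrofit2.http.POST

// Hypothetical endpoints; check the repository for the project's actual API.
interface TrainingApi {

    // Upload a batch of recorded EMG data for training.
    @POST("training/data")
    fun uploadSamples(@Body samples: RequestBody): Single<UploadResponse>

    // Fetch the trained model once the server is done.
    @GET("training/model")
    fun getModel(): Single<ModelResponse>
}

data class UploadResponse(val accepted: Boolean)
data class ModelResponse(val modelUrl: String)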

Part number 1. Android. This time we will look under the hood of the project, at everything that happens on Android before the data is sent to the server.

It is called NUKLEOS (https://github.com/cyber-punk-me/nukleos)
Stack:

- Kotlin
- MVP
- Dagger2
- Retrofit2
- RxKotlin, RxAndroid

For the Raspberry:

- Android Things

At work they don't let me play with architecture, so here, at last, was a chance to fiddle with an old toy called MVP.

The application consists of one BottomNavigation-style Activity and 4 fragments.
The first one is the list of all available BT devices.

We chose an 8-channel BT sensor which had its own API for working over BT. Unfortunately, the API turned out to be absolutely useless: it immediately classified the input into one of 6 (or so) movement types, with a recognition accuracy of about 80%, and that was no good. We needed the actual data: the changes in bioelectric potentials that occur in a person's muscles when muscle fibers are excited. For that we had to work with the sensor directly. Its creators left a description of the communication protocol, so figuring it out did not take too long. I can describe working with bare BT devices in a separate article if there is interest, but in brief it looks like this:

class BluetoothConnector(val context: Context) {

    private val mBTLowEnergyScanner by lazy {
        (context.getSystemService(Activity.BLUETOOTH_SERVICE) as BluetoothManager)
                .adapter.bluetoothLeScanner
    }

    private var mBluetoothScanCallback: BluetoothScanCallback? = null

    // Scan: every discovered device is emitted into the Flowable.
    fun startBluetoothScan(serviceUUID: UUID?) =
            Flowable.create<BluetoothDevice>({
                mBluetoothScanCallback = BluetoothScanCallback(it)
                if (serviceUUID == null) {
                    mBTLowEnergyScanner.startScan(mBluetoothScanCallback)
                } else {
                    mBTLowEnergyScanner.startScan(
                            arrayListOf(ScanFilter.Builder()
                                    .setServiceUuid(ParcelUuid(serviceUUID)).build()),
                            ScanSettings.Builder()
                                    .setScanMode(ScanSettings.SCAN_MODE_LOW_LATENCY).build(),
                            mBluetoothScanCallback)
                }
            }, BackpressureStrategy.BUFFER)
                    // doOnCancel must be chained onto the Flowable; wrapping it in
                    // apply { } would discard the operator and the scan would never stop.
                    .doOnCancel { mBTLowEnergyScanner.stopScan(mBluetoothScanCallback) }

    // Scan with a timeout.
    fun startBluetoothScan(interval: Long, timeUnit: TimeUnit, serviceUUID: UUID? = null) =
            startBluetoothScan(serviceUUID).takeUntil(Flowable.timer(interval, timeUnit))

    inner class BluetoothScanCallback(private val emitter: FlowableEmitter<BluetoothDevice>) : ScanCallback() {

        override fun onScanResult(callbackType: Int, result: ScanResult?) {
            super.onScanResult(callbackType, result)
            result?.let { emitter.onNext(it.device) }
        }

        override fun onScanFailed(errorCode: Int) {
            super.onScanFailed(errorCode)
            emitter.onError(RuntimeException("BLE scan failed: $errorCode"))
        }
    }
}

Carefully wrap the standard BT scanner in Rx and get a lot less pain.

Next we launch the scan and, thanks to Rx, build a list of all discovered devices right in the subscription, feeding them into a RecyclerView:

mFindSubscription = mFindFlowable
        ?.subscribeOn(Schedulers.io())
        ?.observeOn(AndroidSchedulers.mainThread())
        ?.subscribe({
            // onNext: a new device was found.
            if (it !in mBluetoothStuffManager.foundBTDevicesList) {
                addSensorToList(SensorStuff(it.name, it.address))
                mBluetoothStuffManager.foundBTDevicesList.add(it)
            }
        }, {
            // onError: the scan failed.
            hideFindLoader()
            showFindError()
            if (mBluetoothStuffManager.foundBTDevicesList.isEmpty()) {
                showEmptyListText()
            }
        }, {
            // onComplete: the scan finished normally.
            hideFindLoader()
            showFindSuccess()
            if (mBluetoothStuffManager.foundBTDevicesList.isEmpty()) {
                showEmptyListText()
            }
        })

We pick one of the devices, select it, and move on to the next screen:
“Sensor Settings”

We connect to the sensor and start streaming its data using commands prepared in advance. Fortunately, the sensor's creators described the protocol for working with the device:

object CommandList {

    // Stop the streaming
    fun stopStreaming(): Command {
        val command_data = 0x01.toByte()
        val payload_data = 3.toByte()
        val emg_mode = 0x00.toByte()
        val imu_mode = 0x00.toByte()
        val class_mode = 0x00.toByte()
        return byteArrayOf(command_data, payload_data, emg_mode, imu_mode, class_mode)
    }

    // Start streaming (with filter)
    fun emgFilteredOnly(): Command {
        val command_data = 0x01.toByte()
        val payload_data = 3.toByte()
        val emg_mode = 0x02.toByte()
        val imu_mode = 0x00.toByte()
        val class_mode = 0x00.toByte()
        return byteArrayOf(command_data, payload_data, emg_mode, imu_mode, class_mode)
    }

    .....

Working with the device is likewise carefully wrapped in Rx, so it causes no pain either.
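For instance, firing one of those commands at the sensor's control characteristic can be wrapped in a Completable along these lines. This is a sketch: the UUIDs are placeholders, the real ones come from the sensor's specification.

import android.bluetooth.BluetoothGatt
import io.reactivex.Completable
import java.util.UUID

// Placeholder UUIDs; the real ones are in the sensor's protocol description.
val CONTROL_SERVICE: UUID = UUID.fromString("0000fe00-0000-1000-8000-00805f9b34fb")
val COMMAND_CHARACTERISTIC: UUID = UUID.fromString("0000fe01-0000-1000-8000-00805f9b34fb")

fun BluetoothGatt.sendCommand(command: ByteArray): Completable = Completable.fromAction {
    val characteristic = getService(CONTROL_SERVICE)
            .getCharacteristic(COMMAND_CHARACTERISTIC)
    characteristic.value = command
    // The actual write result arrives later in BluetoothGattCallback.onCharacteristicWrite.
    if (!writeCharacteristic(characteristic)) {
        throw RuntimeException("Could not enqueue the command write")
    }
}

Starting the stream then boils down to something like gatt.sendCommand(CommandList.emgFilteredOnly()).subscribe(...).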

Naturally, the sensors return ByteArrays, so we had to knock together a converter; the sensors run at 200 Hz... If it is interesting I can describe it in detail (or just look at the code), but the gist is sketched below.
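A minimal sketch of such a converter, assuming each BLE notification packs two consecutive 8-channel frames of signed bytes; that is one common layout, but the real one is in the sensor's spec.

// Sketch: unpack one 16-byte notification into two 8-channel EMG frames.
// The packet layout is an assumption here; consult the sensor specification.
fun ByteArray.toEmgFrames(): List<FloatArray> {
    require(size == 16) { "Unexpected packet size: $size" }
    return listOf(
            FloatArray(8) { this[it].toFloat() },
            FloatArray(8) { this[it + 8].toFloat() }
    )
}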

In the end we work with this fairly large stream of data in two ways:

1 - We need to draw the curves from each of the sensors. Of course, drawing ABSOLUTELY all of the data makes no sense: on a mobile device the eye cannot take in 200 changes per second on each sensor, so we will not take everything.

2 - We need to work with the entire volume of data when it comes to training or recognition.

For these needs Rx, with all its filters, is perfect.

We had to build the charts ourselves. If you are curious, see PowerfullChartsView in the views folder.
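Splitting one stream between the charts and the recognizer is exactly where those Rx filters shine. A sketch, assuming emgFlowable emits one 8-channel frame at a time (the names and the window size are mine, not the project's):

import io.reactivex.Flowable
import java.util.concurrent.TimeUnit

fun wireUp(emgFlowable: Flowable<FloatArray>) {
    // Charts: only the latest frame a few times a second --
    // nobody can watch 200 updates per second anyway.
    val chartSubscription = emgFlowable
            .sample(50, TimeUnit.MILLISECONDS) // ~20 frames/s is plenty for drawing
            .subscribe { frame -> /* push the frame into PowerfullChartsView */ }

    // Training / recognition: every single frame, batched into windows.
    val dataSubscription = emgFlowable
            .buffer(64) // hypothetical window size
            .subscribe { window -> /* hand the full window to the model */ }
}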

And now a little video:


In the video you can see Kirill working with the system as a whole. The video shows work with the model, but the model currently lives on the server. In the future it will, of course, run on the device itself, which will significantly speed up the response.

Write which aspects are interesting and what we should cover in more detail. Naturally, we are still working on the project and are open to your suggestions.

The whole project is on GitHub: https://github.com/cyber-punk-me/nukleos

Source: https://habr.com/ru/post/438034/