
Archives for: Technology

Checking security of closed source Crypto-Apps by reverse engineering

One important rule in modern cryptography is Kerckhoffs’s principle: it states that a cryptosystem must remain secure even if every detail of the system except the secret key is publicly known.

For apps that use cryptography, it is important that everyone can verify whether the developers’ promises hold and the apps’ code is really secure. Normally this requires the apps to be open source: everyone can then look into the source code, and cryptographers can check it for security issues.

There are some promising apps using cryptography which seem to do quite a good job on the one hand. On the other hand, the security of these apps cannot be proved, as they are closed source. One example is the app Threema, an alternative to the well-known WhatsApp. Threema uses end-to-end encryption to secure your smartphone chats on Android and iOS. The developer of this app has shown how the encryption in use can be validated with a program written in C:

It is also possible to analyze network traffic with tools like Wireshark, or Shark for Root on Android systems. However, analyzing the network traffic and running the validation tool alone is not a sufficient criterion; an analysis of the source code is needed for a meaningful security check. Although the source code of Threema is not published by the developer, it is in principle possible to analyze the code of the Android app. The magic word is reverse engineering.

Be warned: the Threema license agreement explicitly prohibits reverse engineering of the app, so I do not recommend doing this! I am merely showing, in general terms, an easy way to reverse engineer any Android APK using Windows. Perform the following steps only if you have explicit permission for reverse engineering from the app’s developer.

First you need the following things:
1. The .apk file of the app you want to analyze.

2. A tool called dex2jar. You can download it here:

3. Another tool called JD-GUI. Available here:

4. Unzip dex2jar.

5. Add the unzipped dex2jar folder to your PATH environment variable.

6. Open the .apk file with a zip program and unzip its contents. The .apk archive contains a file ending in .dex, which holds the bytecode for Android’s Dalvik VM.

7. Open the console and navigate to the folder containing the .dex file.

8. Enter “dex2jar filename.dex”, replacing “filename.dex” with the real name of the file.

9. This extracts the Java bytecode from the .dex file and creates a .jar file. You now have all the .class files with Java bytecode inside this generated .jar file.

10. This bytecode can be converted back to Java code with JD-GUI, so open the created .jar file with it.

11. Voilà. Now you can see the decompiled Java code.
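The unpacking part of the steps above can be sketched in Python. An .apk is an ordinary zip archive, so the standard zipfile module is enough to pull out the .dex file; the dex2jar call assumes the tool is on your PATH, and the function names here are illustrative:

```python
import subprocess
import zipfile
from pathlib import Path

def extract_dex(apk_path: str, out_dir: str) -> list:
    """Unzip an .apk (a plain zip archive) and return the extracted .dex files."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    dex_files = []
    with zipfile.ZipFile(apk_path) as apk:
        for name in apk.namelist():
            if name.endswith(".dex"):
                apk.extract(name, out)
                dex_files.append(out / name)
    return dex_files

def dex_to_jar(dex_path):
    """Convert Dalvik bytecode to a .jar with dex2jar (must be on PATH)."""
    subprocess.run(["dex2jar", str(dex_path)], check=True)
```

The resulting .jar can then be opened in JD-GUI as described in step 10.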

Unfortunately this method has still some drawbacks:
1. For most apps this is illegal. So again: perform the steps above only for apps whose developers have not forbidden reverse engineering!

2. The recovered Java code is not very pleasant to read, because most of the class, package and variable names cannot be restored. The Java compiler has also optimized parts of the code for speed: loops may have been unrolled, for example, among other transformations. So the result does not look as clean as the original. However, the logical structure of the code is correct, and with time and commitment it is probably possible to do a full source code analysis.

3. The reverse engineered code is only valid for the APK version of an app, not for a version distributed through the Google Play Store. The Play Store version might not be identical to the APK version, so it could contain additional backdoors.

Finally, it seems that the security of apps like Threema can already be checked today by analyzing the source code of the client app. Hopefully some crypto experts will analyze that source code, perhaps with the help of reverse engineering (of course only with permission from Kasper Systems), and check whether they find any security flaws in Threema’s code.

AR.Drone 2 flies with the power of your mind

Control an AR.Drone 2.0 with your mind. Let it fly with cognitive thoughts, facial expressions, the gyroscope or the keyboard, like a helicopter in GTA.
It’s magic!

MindDrone allows you to control an AR.Drone 2.0 from Parrot with your keyboard or an Emotiv EPOC. You can let it fly just with your thoughts. Connect to the AR.Drone’s WLAN before you start the app. MindDrone includes four different control modes for your drone:

1. Keyboard:
You can control the drone with the keys Q, W, E, A, S, D, Page Up and Page Down. The keyboard control is very intuitive and works even more precisely than the smartphone control with AR.FreeFlight. Have you played a game in the GTA series and flown a helicopter there? Then you are already familiar with the keyboard control: you can fly the drone the same way as a GTA helicopter. But please keep in mind: this is reality!

2. Cognitiv Suite:
Use the power of your brain. EPOC reads your brainwaves and recognizes different cognitive thoughts. Just imagine that you push something forward, and the drone flies forward (or in any other direction you are thinking of). The drone moves in the same way as the cube in EPOC’s Control Panel. Of course you can configure this control mode and map your preferred cognitive thoughts to the drone actions you like. You can also combine this mode with the Expressiv Suite and/or EPOC’s gyroscope.

3. Expressiv Suite:
Use facial expressions to send special commands to your drone, for example take off, land or record video. You can also steer the AR.Drone with your eyes: look to the left and your drone turns left; look to the right and it turns right. Raise your eyebrows and the drone flies up; lower them and it flies down. The Expressiv commands are also fully configurable.

4. Gyroscope:
Move your head and the drone moves: turn your head left (or right) and it turns left (or right); move your head forward (or backward) and the drone flies forward (or backward).

If you want to use Cognitiv, Expressiv or Gyro control, ensure that the EPOC Control Panel is running before activating these control modes. The keyboard control is always active and overrides the other control modes, so if Cognitiv, Expressiv or Gyro control doesn’t behave the way you want, you can override the drone’s movement with the keyboard. Please make sure you are familiar with the keyboard control before activating the other modes. Cognitiv, Expressiv and Gyro control can be activated and configured via the configuration window.
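As a rough illustration of how such a keyboard mode might map keys to drone actions, here is a sketch; the command names below are hypothetical and not the actual API of MindDrone or the underlying C# library:

```python
# Hypothetical key-to-command mapping for a GTA-style drone control.
# The command names are illustrative placeholders, not MindDrone's API.
KEY_COMMANDS = {
    "W": "pitch_forward",    # fly forward
    "S": "pitch_backward",   # fly backward
    "A": "roll_left",        # strafe left
    "D": "roll_right",       # strafe right
    "Q": "yaw_left",         # turn left
    "E": "yaw_right",        # turn right
    "PageUp": "gaz_up",      # gain altitude
    "PageDown": "gaz_down",  # lose altitude
}

def command_for_key(key):
    """Return the drone command for a pressed key, or None if unmapped."""
    return KEY_COMMANDS.get(key)
```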

Video is supported; the FFMPEG library is needed for it. Due to licensing reasons, that library cannot be included directly. Download FFMPEG from here, unzip the file, go to the folder ffmpeg-1.2-win32-shared/bin and copy all files in it to MindDrone’s application folder. Then run the Windows cmd console as administrator, change to the application folder within the command window and execute create-symlinks.cmd. After that you should see the picture from the AR.Drone’s camera within MindDrone. If you have recorded a video, FFMPEG is executed automatically when you close the application to encode a video file of your flight. Ensure that your Windows user has write privileges to MindDrone’s application folder.
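A small sanity check for this setup could look like the following sketch. The file names are an assumption based on the ffmpeg-1.2-win32-shared layout mentioned above and may differ for other FFMPEG versions:

```python
from pathlib import Path

# Assumed file names from the ffmpeg-1.2-win32-shared/bin folder;
# other FFMPEG versions use different library version numbers.
REQUIRED = ["ffmpeg.exe", "avcodec-54.dll", "avformat-54.dll", "avutil-52.dll"]

def missing_ffmpeg_files(app_folder):
    """Return the required FFMPEG files that are not present in app_folder."""
    folder = Path(app_folder)
    return [name for name in REQUIRED if not (folder / name).is_file()]
```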

Requirements: AR.Drone 2.0, Microsoft Windows, WLAN, .NET Framework 4 or higher.

Ensure you are good enough at controlling the cube in EPOC’s Control Panel before using the Cognitiv drone control, and also check the Expressiv recognition before using the Expressiv drone control. When using Cognitiv, Expressiv or Gyro control for the first time, fly your drone in a wide open area to get familiar with the different control modes. Override unexpected drone behaviour with the keyboard. Make sure that the MindDrone window stays in focus while the drone is flying: key presses are only recognized when MindDrone’s window is in focus.

You always fly at your own risk. Any liability for accidents with the drone is excluded.

This app uses the C# AR.Drone 2.0 control library by RUSLAN-B. Ruslan’s great work was essential to making MindDrone possible.

MindDrone is available at EmoStore, the app store by Emotiv:

App update: Move the mouse cursor just with your thoughts

A month ago we released an application that could execute mouse actions like clicking or scrolling with the help of your mind, using the Emotiv EPOC. It used the gyroscope to control the mouse cursor. Now we have released a new major version with a significant new feature:

If you want, you can do everything, really everything, just with your mind and your thoughts. No gyroscope and no mouse are needed anymore. Besides the Expressiv data that EPOC delivers (explained in our old article), EPOC also recognizes cognitive thoughts, which can be mapped to four different directions. At the beginning this is not easy: you really have to concentrate and train a lot so that EPOC can correctly distinguish the different direction thoughts in your mind.

But once you have done that, you can move the mouse cursor just with your thoughts.

To be honest: moving the cursor is still easier with the gyroscope. But if you are really good at controlling your cognitive thoughts with EPOC, or you are not able to move your head, our updated Neuro Mousecontrol may help.

Neuro Mousecontrol is available at EmoStore, the app store by Emotiv:

Use your brain as mouse control (with EPOC by Emotiv)

Can you imagine an easier way to control the mouse than with your head? The answer could be: with your mind.
Electroencephalography (EEG) was established back in 1924. In the past it was used mainly for medical purposes. But now you can also use this technology to control your computer.

Emotiv EPOC

Emotiv has developed a neuroheadset called EPOC. It reads your brainwaves and, on that basis, is able to detect your current feelings such as boredom, frustration, meditation and excitement. Your brain also sends certain signal patterns when your face shows facial expressions, so EPOC can detect expressions like smiling, lowering and raising your eyebrows, clenching your teeth or blinking your eyes. This is quite similar to the facial expression detection provided by the Kinect for Windows SDK. Furthermore, EPOC has a gyroscope that recognizes your head movement very accurately. So we developed a version of our mouse control especially for EPOC.

Neuro Mousecontrol

You can replace your mouse with the Emotiv EPOC, the revolutionary brain-computer interface. Within the Emotiv Control Panel there is already a mouse emulator that uses the EPOC gyroscope to control the mouse cursor. EmoKey, an app developed by Emotiv, provides basic clicking functionality with EPOC, but this method has several disadvantages:

  • With EmoKey, clicks are sometimes not executed correctly in certain kinds of application windows.
  • It is not possible to send a scroll command with EmoKey.

Neuro Mousecontrol sends mouse clicks directly to the operating system, independent of the currently active application window. It also supports scrolling and drag & drop, which is not possible with EmoKey.

Our app is the easiest way to replace your mouse with EPOC.

Neuro Mousecontrol is available at EmoStore, the app store by Emotiv:

Use your face as mouse control (with Kinect)

Replace your mouse with your face. Control the cursor just by moving your head, click by winking, and scroll by raising and lowering your eyebrows. All of this is now possible with ExpressionMouse Kinect.

While using our KinectMouse, we noticed that it is very exhausting for your arm to control the mouse cursor with your hand all the time. So we looked for an easier method to control the cursor with Kinect. All you have to do is move your head and use some facial expressions for certain mouse actions.

See how this new kind of mouse control works:

How does our application work?

Cursor Moving

It is really easy: just move your head to control the cursor. Make sure that Kinect can see your face as well as your chest. Sometimes the initial recognition works better when you wave. It is normal for Kinect to need a few seconds to identify your face correctly. In contrast to KinectMouse, ExpressionMouse Kinect is more precise when you are closer to the sensor (but not too close), as the sensor then has a more detailed view of your face. One meter should be a sufficient distance.

Left Click

Just wink with your right eye for about a second. At this point you may ask why you use your right eye for a left click and not your left eye. During testing we found that the sensor detected our right eyes much more accurately than our left eyes, so we decided to swap left and right. As left clicks are much more frequent than right clicks, we think it makes sense to use the most sensitive wink for the left click.

Right Click

Just wink with your left eye for about a second.

Double Click

Wink with both eyes at the same time, and a double click will be executed.


Scrolling

Raise your eyebrows to scroll up and lower them to scroll down.

Drag & Drop

Open your mouth to start drag & drop. Move your head to move the cursor while keeping your mouth open. To drop, just close your mouth.
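The click gestures above are frame-based: an expression has to persist for a number of frames before the action fires. A minimal sketch of how such wink detection could work, assuming roughly 30 fps input (the class and threshold are illustrative, not the app’s actual implementation):

```python
# Illustrative frame-based wink detection: a click fires once the eye
# has been seen closed in enough consecutive frames (~1 s at 30 fps).
class WinkDetector:
    def __init__(self, frames_required=30):
        self.frames_required = frames_required
        self.closed_streak = 0

    def update(self, eye_closed):
        """Feed one frame's eye state; return True exactly when a click fires."""
        if eye_closed:
            self.closed_streak += 1
            if self.closed_streak == self.frames_required:
                return True  # fire once per wink, not on every later frame
        else:
            self.closed_streak = 0
        return False
```

Requiring a streak of frames rather than a single frame is what keeps ordinary blinks from triggering clicks.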

Find the correct settings for yourself

Every face is different, so the preselected settings in ExpressionMouse Kinect may not be optimal for you. Just play with the thresholds a bit until you are satisfied.

  • ClickDelay: time span in frames (Kinect normally works at 30 fps) that has to elapse between two mouse actions.
  • Headrotation Smoothing Filter Values: frame weights for calculating the weighted average of your head rotation, used for smoothing the cursor motion. If you enter the following, the cursor becomes more precise but also a bit more delayed: “2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1”
  • Percentage of horizontal edge pixels: used for differentiating between an open and a closed eye. A higher value makes the closed-eye detection less sensitive, a lower value makes it more sensitive.
  • Used frames for closed eye detection: more frames increase the accuracy of the closed-eye detection, but also the time span between closing the eye and the execution of the mouse click.
  • Eye closed filter threshold: also used for differentiating between an open and a closed eye. A higher value makes the closed-eye detection less sensitive, a lower value makes it more sensitive.
  • Double click second eye threshold: threshold for differentiating between a normal click and a double click. If double clicks are not recognized correctly, decrease this value; if normal clicks are recognized as double clicks, increase it.
  • Brow raiser start threshold: threshold for raising your brow. Decrease this value if raising your brow is not recognized; increase it if your computer scrolls up even though you are not raising your brow.
  • Brow lowerer start threshold: same as the brow raiser start threshold, but for lowering your brow and scrolling down.
  • Mouth open start threshold: threshold for opening your mouth (executing the MouseDown event). Increase it if opening your mouth is not recognized correctly; decrease it if MouseDown is executed even though your mouth is closed.
  • Mouth open confirmation: decrease this value if MouseUp is executed even though your mouth is still open.
  • Mouth open end threshold: threshold for closing your mouth (executing the MouseUp event). Increase it if closing your mouth is not recognized correctly; decrease it if MouseUp is executed even though your mouth is still open.
  • Scroll multiplier up: a higher value means faster scrolling up.
  • Scroll multiplier down: a higher value means faster scrolling down.
  • Head to Screen relation X – Width: sensitivity of the mouse cursor in the horizontal direction. A higher value means less sensitivity.
  • Head to Screen relation Y – Height: sensitivity of the mouse cursor in the vertical direction. A higher value means less sensitivity.
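The smoothing behind the “Headrotation Smoothing Filter Values” setting is a weighted moving average over the last few frames. A minimal sketch:

```python
# Weighted moving average of recent head-rotation samples (newest first),
# as used to smooth the cursor motion. Longer or heavier weight lists
# mean a smoother but more delayed cursor.
def smooth(samples, weights):
    """Return the weighted average of the most recent samples."""
    used = samples[:len(weights)]   # only as many samples as we have weights
    w = weights[:len(used)]        # tolerate a short sample history
    return sum(s * wi for s, wi in zip(used, w)) / sum(w)
```

With weights like “2, 1, 1, …” the newest frame counts double, which is why that setting makes the cursor more precise at the cost of a slight delay.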

The free version only works with Kinect for Windows sensor. The PRO version saves your changed settings automatically and also works with Xbox 360 Kinect sensor.

I’m always happy to get feedback, so please comment and let me know if you are satisfied with this app. If you have any questions, feel free to ask.

Update (20.01.2017):

As the stores for the PRO version have been unavailable for a while now, and as I currently don’t have much time to maintain this project, I have decided to make this app open source. You now get the “PRO version” via the download link.
Check out the Source here:
ExpressionMouse on GitHub

  1. I’ve uploaded a new version 1.0.2 of ExpressionMouse Kinect with several minor bugfixes.
  2. A PRO version is now available. The PRO version saves your settings automatically and supports the Kinect sensor for Xbox 360 as well as the Kinect for Windows sensor. You can download it via the Motionfair Store.
  3. Due to a trademark complaint this app is now called ExpressionMouse.
  4. Users of the PRO version: if you use the Kinect for Xbox sensor, you have to install the Kinect SDK. It is not sufficient to run ExpressionMouse with the Kinect runtime and an Xbox Kinect sensor.
Use your hand instead of a mouse (with Kinect for Windows)

Within a university programming project we tried out several methods of controlling the mouse cursor with Kinect for Windows. One of the results is a small application that lets you control the Windows mouse cursor with your hands. In general, the cursor moves like the Kinect cursor on the Xbox, but in our opinion the Xbox Kinect mouse control has several disadvantages:

  1. On the Xbox you need to hold the cursor over a tile for about a second to click on it. This is an unnecessary delay, so we looked for a clicking method that is as fast as a click with a PC mouse.
  2. With the Xbox Kinect control you can only perform a standard click. There is no possibility for a right click, a double click or drag & drop, but you need all of these mouse actions when working with your PC.

How does our application work?

Cursor Moving

Just move your hand to control the cursor. The recognition of your hand is a bit insensitive when you are too close to the sensor; if the cursor jumps around too much, just take one or two steps back. When you start the application it takes a moment for the sensor to recognize your hand. Make sure that Kinect can see your face, your chest and both of your hands. Sometimes the initial recognition works better when you wave.

Left Click

As Kinect measures the distance from your hand to the sensor, clicking is possible by moving your hand slightly forward and then directly backwards (see picture below). Just make a small motion; the sensor should recognize it and execute the click at the position where the mouse cursor was when you started the motion.


Double Click

Double clicking works the same way as normal clicking. The difference is that you have to move your hand a bit further forward and backward.

Right Click

For a right click the motion is quite similar to that of the left click. Use your left hand and move it slightly backward and then directly forward (instead of forward and backward as for the left click).
It is important to use your left hand for right clicks, as the gesture would otherwise be ambiguous between a right click and drag & drop.

Drag & Drop

Just move your right hand slightly back. This motion executes a Mouse Down event. Then you can move the cursor. When you move your hand forward again, the Mouse Up event is executed and drag & drop ends.
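The push gesture described above can be sketched as a simple check on the recent hand-to-sensor distances; the thresholds below are made-up example values, not the app’s actual ones:

```python
# Illustrative push-gesture ("click") detection from a short history of
# hand-to-sensor distances: a quick forward move (distance shrinks)
# followed directly by a backward move. Thresholds are example values.
def is_push_click(depths_m, push=0.05, back=0.04):
    """depths_m: hand-to-sensor distances in meters, oldest first."""
    if len(depths_m) < 3:
        return False
    start, nearest, end = depths_m[0], min(depths_m), depths_m[-1]
    moved_forward = start - nearest >= push  # hand came closer to the sensor
    moved_back = end - nearest >= back       # and then returned
    return moved_forward and moved_back
```

The click position would be taken from where the cursor was at the start of the gesture, matching the behaviour described for the left click.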
