
Archives for: Ideas

White noise: An idea to defend privacy against XKeyscore

One of Edward Snowden's most important leaks concerned XKeyscore, a piece of software that allows intelligence services to capture and analyse virtually all Internet traffic. By tapping data directly at major Internet nodes, NSA analysts can see almost all Internet activity: e-mails, search queries, and private messages. Nothing is safe from the NSA and its partners.

In recent weeks, many people have been alarmed by this issue, because the data analysis operates outside the limits of constitutional law. No judicial orders were needed to capture and analyse the data. In this way a basic principle of modern democracies was bypassed: the separation of powers between the judiciary and the executive.

The good thing: people have started fighting back against this attack on civil rights by encrypting their mails and messages, using Tor, and learning about cryptographic techniques such as PGP.

But this is not enough: many people are still not aware of the problems of a surveillance society, or are not willing to defend themselves against this kind of total control, because using cryptography and surfing the Internet anonymously is still complicated and not easy enough.

Another idea for defending civil rights and showing civil disobedience against anti-democratic programs like XKeyscore might be to jam the signals so that XKeyscore receives nothing but white noise.

The idea is as follows:

The NSA uses data-mining techniques to filter interesting material out of Internet communication. They want to know whether people are talking about suspicious topics, by filtering for keywords, or whether someone is communicating with the wrong people. The side effect: they know almost everything one sends over the Internet. If we want to stop them from spying on our private lives, we have to disturb their eavesdropping techniques. This could be done by sending large amounts of interfering signals across the Internet.

If enough people create fake identities (for example, mail and social network accounts) all over the Internet, and each of these fake identities starts sending both harmless and "suspicious" messages to other fake identities as well as to real people, we might manage to make XKeyscore completely useless. Let's set up mails, messages, and search queries that the NSA might find interesting. If enough people use and fake Internet identities this way, the NSA will receive nothing but white noise through XKeyscore, because they cannot distinguish between real and fake communication.

At this point you may note that this would mean a lot of manual effort for everyone who wants to take part in this protest. But there is no need for that if we use intelligent botting techniques. If we implement software that creates fake identities by itself and communicates over the Internet like a real person, one just has to install it on a home PC. And the best thing is: no one has to do anything illegal. It is not illegal to create fake identities, and it is also not illegal to let these fake identities communicate. So let's make XKeyscore receive nothing but white noise! Let's reclaim our privacy!
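As an illustration, such a bot could start as small as a script that composes random but plausible-looking messages from templates. Everything below is a hypothetical sketch: the template pools and function names are made up, and a real bot would still need throwaway accounts and a delivery channel. This sketch only generates the text of the interfering signals.

```python
import random

# Hypothetical template pools -- purely illustrative, not part of any real bot.
SUBJECTS = ["the package", "the meeting", "the documents", "the delivery"]
VERBS = ["arrives", "is ready", "was moved", "got delayed"]
DETAILS = ["at the station", "near the bridge", "downtown", "at noon"]

def make_noise_message(rng: random.Random) -> str:
    """Compose one plausible-looking message from random template parts."""
    return f"{rng.choice(SUBJECTS)} {rng.choice(VERBS)} {rng.choice(DETAILS)}"

def noise_stream(n: int, seed: int = 42) -> list:
    """Generate n interfering messages; deterministic for a given seed."""
    rng = random.Random(seed)
    return [make_noise_message(rng) for _ in range(n)]

if __name__ == "__main__":
    for msg in noise_stream(5):
        print(msg)
```

A real deployment would send such messages between fake accounts at randomized intervals; the point of the sketch is only that generating statistically noisy but keyword-rich traffic is technically trivial.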

Let's discuss this topic: do you think this is a good idea, or is it complete bullshit? Would it be worth the effort to implement such bots, or is it useless in your opinion? Or do you have better ideas? Spread the word if you like it!

AR.Drone 2 flies with the power of your mind

Control an AR.Drone 2.0 with your mind. Let it fly with cognitive thoughts, facial expressions, the gyroscope, or the keyboard, like a helicopter in GTA.
It's magic!

MindDrone allows you to control a Parrot AR.Drone 2.0 with your keyboard or an Emotiv EPOC headset. You can let it fly just with your thoughts. Connect to the AR.Drone's WLAN before you start the app. MindDrone includes four different control modes for your drone:

1. Keyboard:
You can control the drone with the keys Q, W, E, A, S, D, PageUp, and PageDown. The keyboard control is very intuitive and works even more precisely than the smartphone control with AR.FreeFlight. Have you played a game in the GTA series and flown a helicopter there? Then you are already familiar with the keyboard controls: you can fly the drone in the same way as a GTA helicopter. But please keep in mind: this is reality!
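For illustration, a GTA-style keyboard layout can be modeled as a simple lookup table. The exact bindings below are an assumption for this sketch; MindDrone's real mapping of Q, W, E, A, S, D, PageUp, and PageDown may differ.

```python
# Hypothetical key-to-command table for a GTA-style helicopter layout.
# The exact bindings are an assumption; MindDrone's real mapping may differ.
KEY_BINDINGS = {
    "W": "pitch_forward",
    "S": "pitch_backward",
    "A": "roll_left",
    "D": "roll_right",
    "Q": "yaw_left",
    "E": "yaw_right",
    "PageUp": "gain_altitude",
    "PageDown": "lose_altitude",
}

def command_for_key(key: str):
    """Look up the drone command for a pressed key; None means hover."""
    return KEY_BINDINGS.get(key)
```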

2. Cognitiv Suite:
Use the power of your brain. The EPOC reads your brainwaves and recognizes different cognitive thoughts. Just imagine that you are pushing something forward, and the drone flies forward (or in any other direction you are thinking of). The drone moves in the same way as the cube in the EPOC Control Panel. Of course you can configure this control mode and map your preferred cognitive thoughts to the drone actions you like. You can also combine this control mode with the Expressiv Suite and/or the EPOC's gyroscope.

3. Expressiv Suite:
Use facial expressions to send special commands to your drone, for example take-off, landing, or recording video. You can also move the AR.Drone with your eyes: look to the left and your drone turns left; look to the right and it turns right. Raise your eyebrows and the drone flies up; lower them and it flies down. The Expressiv commands are also fully configurable.

4. Gyroscope:
Move your head and the drone moves: turn your head left (or right) and it turns left (or right); tilt your head forward (or backward) and the drone flies forward (or backward).

If you want to use Cognitiv, Expressiv, or Gyro control, make sure the EPOC Control Panel is running before activating these control modes. Keyboard control is always active and overrides the other control modes. If Cognitiv, Expressiv, or Gyro control doesn't work the way you want, you can override the drone's movement with the keyboard. So please make sure you are familiar with the keyboard controls before activating the other control modes. Cognitiv, Expressiv, and Gyro control can be activated and configured via the configuration window.
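The priority rule described above, where the keyboard always wins, can be sketched as a simple first-match dispatch. The mode order and command strings here are illustrative assumptions, not MindDrone's actual internals.

```python
# Sketch of the control-priority rule: the keyboard always wins, and the
# other modes only contribute when no key is pressed. Command strings are
# illustrative; None means "no input from this mode".

def resolve_command(keyboard_cmd, cognitiv_cmd, expressiv_cmd, gyro_cmd):
    """Return the command to send to the drone, checking keyboard first."""
    for cmd in (keyboard_cmd, cognitiv_cmd, expressiv_cmd, gyro_cmd):
        if cmd is not None:
            return cmd
    return "hover"  # no input from any mode
```

For example, `resolve_command("pitch_forward", "yaw_left", None, None)` returns `"pitch_forward"`: a pressed key silently overrides whatever the headset suggests.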

Video is supported; for this, the FFMPEG library is needed. Due to licensing reasons, that library cannot be included directly. You have to download FFMPEG from here. Unzip the file, go to the folder ffmpeg-1.2-win32-shared/bin, and copy all the files in this folder to MindDrone's application folder. Then run the Windows cmd console as administrator, change to the application folder within the command window, and execute create-symlinks.cmd. After that you should see the picture from the AR.Drone's camera within MindDrone. If you have recorded a video and then close the application, FFMPEG is executed automatically to encode a video file of your flight. Make sure that your Windows user has write privileges for MindDrone's application folder.

Requirements: AR.Drone 2.0, Microsoft Windows, WLAN, .NET Framework 4 or higher.

Make sure you are good enough at controlling the cube in the EPOC Control Panel before using the Cognitiv drone control. Please also check the Expressiv recognition before using the Expressiv drone control. When using Cognitiv, Expressiv, or Gyro control for the first time, you should fly your drone in a wide open area to get familiar with the different control modes. Override unexpected drone behaviour with the keyboard. Make sure that the MindDrone window stays in focus while the drone is flying: key presses are only recognized when MindDrone's window is in focus.

You always fly at your own risk. Any liability for accidents with the drone is excluded.

This app uses the C# AR.Drone 2.0 control library by RUSLAN-B. Ruslan's great work was important in making MindDrone possible.

MindDrone is available at EmoStore, the app store by Emotiv:

Use your face as mouse control (with Kinect)

Replace your mouse with your face. Control the cursor just by moving your head, click by winking, and scroll by raising and lowering your eyebrows. All of that is now possible with ExpressionMouse Kinect.

While using our KinectMouse, we found that it is very tiring for your arm to control the mouse cursor with your hand all the time. So we looked for an easier way to control the mouse cursor with Kinect. All you have to do is move your head and use a few facial expressions for certain mouse actions.

See how this new kind of mouse control works:

How does our application work?

Cursor Moving

It is really easy: just move your head to control the cursor. Make sure that Kinect can see your face as well as your chest. Sometimes the initial recognition works better when you wave. It is normal for Kinect to need a few seconds to identify your face correctly. In contrast to KinectMouse, ExpressionMouse Kinect is more precise when you are closer to the sensor (but not too close), because the sensor then has a more detailed view of your face. One meter should be a sufficient distance.

Left Click

Just wink with your right eye for about a second. At this point you may ask why you have to use your right eye for a left click, and not your left eye. During testing, we found that the sensor detected our right eyes much more accurately than our left eyes, so we decided to swap left and right. Since left clicks are much more frequent than right clicks, we think it is a good idea to use the more sensitive wink for the left click.

Right Click

Just wink with your left eye for about a second.

Double Click

Wink with both eyes at the same time and a double click will be executed.


Scrolling

Raise your eyebrows to scroll up and lower them to scroll down.

Drag & Drop

Open your mouth to start drag & drop. Keep your mouth open and move your head to move the cursor. To drop, just close your mouth.
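Taken together, the gestures above amount to a small gesture-to-action table. The sketch below is a hedged illustration with made-up names, not ExpressionMouse's actual code; the real app derives the gestures from Kinect face-tracking data.

```python
# Sketch of the gesture-to-mouse-action mapping described above.
# Gesture and action names are illustrative assumptions.
GESTURE_ACTIONS = {
    "right_eye_wink": "left_click",    # swapped on purpose, see above
    "left_eye_wink": "right_click",
    "both_eyes_wink": "double_click",
    "brow_raise": "scroll_up",
    "brow_lower": "scroll_down",
    "mouth_open": "mouse_down",        # starts drag & drop
    "mouth_close": "mouse_up",         # drops
}

def action_for(gesture: str):
    """Map a detected gesture to a mouse action; None means just move the cursor."""
    return GESTURE_ACTIONS.get(gesture)
```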

Find the correct settings for yourself

Every face is different, so it could be that the preselected settings in ExpressionMouse Kinect are not optimal for you. Just play with the thresholds a bit until you are satisfied.

  • ClickDelay: Time span in frames (Kinect normally runs at 30 fps) that has to elapse between two mouse actions.
  • Headrotation Smoothing Filter Values: Frame weights for calculating a weighted average of your head rotation, used for smoothing the cursor motion. If you enter the following, the cursor will become more precise, but also a bit more delayed: “2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1”
  • Percentage of horizontal edge pixels: Used to distinguish between an open and a closed eye. A higher value makes the closed-eye detection less sensitive; a lower value makes it more sensitive.
  • Used frames for closed eye detection: More frames increase the accuracy of the closed-eye detection, but they also increase the delay between closing the eye and the execution of the mouse click.
  • Eye closed filter threshold: Also used to distinguish between an open and a closed eye. A higher value makes the closed-eye detection less sensitive; a lower value makes it more sensitive.
  • Double click second eye threshold: Threshold for distinguishing between a normal click and a double click. If double clicks are not recognized correctly, decrease this value. If normal clicks are recognized as double clicks, increase it.
  • Brow raiser start threshold: Threshold for raising your brow. Decrease this value if raising your brow is not recognized. If your computer scrolls up even though you are not raising your brow, increase it.
  • Brow lowerer start threshold: Same as the brow raiser start threshold, but for lowering your brow and scrolling down.
  • Mouth open start threshold: Threshold for opening your mouth (executing the MouseDown event). Increase it if opening your mouth is not recognized correctly. Decrease it if MouseDown is executed even though your mouth is closed.
  • Mouth open confirmation: Decrease this value if MouseUp is executed even though your mouth is still open.
  • Mouth open end threshold: Threshold for closing your mouth (executing the MouseUp event). Increase it if closing your mouth is not recognized correctly. Decrease it if MouseUp is executed even though your mouth is still open.
  • Scroll multiplier up: A higher value makes scrolling up faster.
  • Scroll multiplier down: A higher value makes scrolling down faster.
  • Head to Screen relation X – Width: Sensitivity of the mouse cursor in the horizontal direction. A higher value means less sensitivity.
  • Head to Screen relation Y – Height: Sensitivity of the mouse cursor in the vertical direction. A higher value means less sensitivity.
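As a worked example of the smoothing filter values, here is a minimal weighted-average filter in Python. The function name and the call convention (newest sample first) are assumptions for this sketch; it only illustrates why a weight list like "2, 1, 1, ..." trades responsiveness for smoothness.

```python
# Minimal sketch of the head-rotation smoothing filter: the frame weights
# form a weighted average over the most recent rotation samples. A weight
# list like [2, 1, 1, ...] favors the newest frame only slightly, which
# smooths the cursor at the cost of a small delay.

def smooth(samples, weights):
    """Weighted average of the latest samples; samples[0] is the newest frame."""
    pairs = list(zip(samples, weights))
    total_weight = sum(w for _, w in pairs)
    return sum(s * w for s, w in pairs) / total_weight
```

For example, with weights `[2, 1]`, a sudden jump from 0.0 to 4.0 in head rotation yields `smooth([4.0, 0.0], [2, 1])`, i.e. 8/3 ≈ 2.67 rather than 4.0, so the cursor eases toward the new position instead of jumping.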

The free version only works with the Kinect for Windows sensor. The PRO version saves your changed settings automatically and also works with the Xbox 360 Kinect sensor.

I'm always happy to get feedback, so please comment and let me know if you are satisfied with this app. If you have any questions, feel free to ask.

Update (20.01.2017):

As the stores for the PRO version have been unavailable for a while now, and as I currently don't have much time to maintain this project, I decided to make this app open source. You now get the “PRO version” via the download link.
Check out the Source here:
ExpressionMouse on GitHub

  1. I've uploaded a new version, 1.0.2, of ExpressionMouse Kinect with several minor bug fixes.
  2. A PRO version is now available. The PRO version saves your settings automatically and supports the Kinect sensor for Xbox 360 as well as the Kinect for Windows sensor. You can download it via the Motionfair Store.
  3. Due to a trademark complaint, this app is now called ExpressionMouse.
  4. Users of the PRO version: if you use the Kinect for Xbox sensor, you have to install the Kinect SDK. It is not sufficient to run ExpressionMouse with the Kinect runtime and an Xbox Kinect sensor.