Functional Testing

Testing the MotionStack with pre-recorded sensor data from actual devices.

As part of the quality assurance process, a functional testing suite is provided to test each individual API of the MotionStack. For each API an appropriate gesture is defined, for which the exact behavior of the API is known. The raw sensor data for the gesture is recorded on a variety of devices using Adtile's Motion-VCR, and each recording is saved as an individual VCR cassette. The data from each cassette is then fed as input into the API, and the API's response is compared to the expected behavior for that gesture. If the API's response exactly matches the expected response, the test is considered successful.

Functional testing has played a pivotal role during development, both for testing modifications to existing features and for creating new ones. In addition, functional testing is used to verify that local builds of the MotionStack work correctly across platforms.

The functional testing suite is installed alongside the MotionStack in node_modules with npm install. To run the pre-existing functional tests, run npm test in the root directory of the MotionStack.

Motion VCR

Adtile's Motion VCR records and plays back sensor data cassettes. It is installed as part of the node modules inside the MotionStack directory. It runs as a separate server and can be sourced in any web or native application.

Recording High Quality Sensor Data

Procedure to Setup the Recording Application

  1. Open a terminal, navigate to the demos git repository, and start a local grunt server. We will be using the demo named charts-vcr.
    cd adtile/demos
    grunt w:charts-vcr
  2. Open another terminal, navigate to the motion-vcr git repository, and start the motion-vcr local server. (Note the HTTP address printed after the motion-vcr server starts; it is used in step 4.)
    cd adtile/motion-vcr
    bin/www
  3. Open another terminal, navigate to the motionstack git repository, and start a local grunt server. This ensures an updated version of motion.min.js is generated and copied to the demos/lib/vlatest directory. Later, if the vlatest version is out of date for some reason, save a file in the server's watch list to regenerate motion.min.js.
    cd adtile/motionstack
    grunt
  4. Copy the HTTP address printed in the terminal by the motion-vcr server (from step 2); in this case it is 192.168.1.17. Open the file demos/charts-vcr/index.html in a text editor and replace the existing HTTP address with the copied one. This must be done in two places in the file. Save the file when done.

  5. Open a web browser on the mobile device and enter the address and port number of the demos local grunt server (in this case, 192.168.1.17:8000). After the page loads, click on demos, then scroll down and click on charts-vcr.

  6. To begin recording, press the start button; to stop recording, press the stop button. The data is saved as a new txt file in the motion-vcr/cassettes directory. The file name is a large integer representing the current time in milliseconds.

Procedure to Record High Quality Sensor Data

Before starting the charts-vcr demo, consider the type of motion to be recorded. What feature are you testing? Is it a simple motion or a more complex one? Can you fully describe exactly how the motion should trigger the feature being tested? Remember, a cassette is recorded only once, but it may be used many times in the future. This is why extra steps must be taken to guarantee the appropriate motions are captured without erroneous data artifacts. Although they may seem unnecessary, these steps help maintain high quality recordings for peer use. Recorded data can be used for automated functional testing, feature optimization, machine learning, testing device functionality when the device is not available, etc.

Laboratory testing should be as objective as possible. This particular type of lab testing has an inherent subjective side due to each user's individual motion traits and feel. Using a robotically exact motion gives the lab worker greater knowledge and repeatability of the motion, whereas collecting actual user data, together with the user's experience response, lets the lab worker map exact motion measurements to human-perceived motion measurements. More on this to come.

  1. Be aware of the initial position of the device before starting the charts-vcr demo.
  2. Hold the device steady and gently press the start and stop buttons. This is to prevent unwanted bumps in the recorded data.
  3. Allow some time to pass before pressing the start button so the sensors can fire up.
  4. After pressing the start button, wait about half a second before beginning the desired motion. This will guarantee the beginning of the motion is fully captured in the cassette data.
  5. Move the device in a controlled fashion. Steady mechanical motions are optimal for testing basic feature behavior. More natural motions are desired for designing features for robust user experiences.
  6. After completing the motion, wait half a second before pressing the stop button. This will guarantee the end of the motion is fully captured in the cassette data.
  7. Start by capturing single simple motions per cassette. Then move on to more complex motions.
  8. Rename the cassette with an appropriate file name. The file name should distinguish the device operating system and version, device model name, browser being used, the feature being tested, and a compact description of the motion.

    a. General naming convention: <os>-<device>-<browser>-<feature>-<motion>.txt

    b. Example: ios7.1.2-iphone4-safari-flick-Right5-medium.txt. This example represents 5 flicks to the right at "medium speed", to be used to test the horizontal tilt feature.

  9. Document the exact feature behavior that should be triggered for each type of motion.

    a. Example from above: ios7.1.2-iphone4-safari-flick-Right5-medium.txt. A JavaScript array describing the exact output of the tilt feature is [4,4,4,4,4], where DIRECTION_RIGHT=4 represents the returned parameter value {direction: DIRECTION_RIGHT}.