iPhone Fireflies Look Awesome

    Whatever your feelings are about mobile device tracking, the fact is, it’s a reality we live with. Even before the iPhone location data hoopla hit the wires, mobile devices, particularly phones, were known to be easy to track. Every time your phone connects to another tower, its position is recorded. The ability to track cell phones goes hand-in-hand with having one, something a simple Google search indicates quite clearly.

    Nevertheless, when news broke that iPhone users were having their locations tracked and stored by their devices, the topic gained a great deal of momentum. Granted, Apple has since addressed the storage of this location data, but the fact remains: iPhones are still quite trackable. In fact, almost every movement of an iPhone user can be followed, provided the user’s phone is on and, of course, they have it with them.

    Over at CrowdFlow.net, the goal is to organize all of this location data for visualization purposes, something the site’s sidebar states clearly:

    You probably know by now that your iPhone collects the position data of wifi and cell networks nearby.

    We would like to combine as many of these log files as possible, create an open database of wifi and cell networks and thus visualize how these networks are distributed all over the world.

    So please contribute your iPhone log files and help us to create an open wifi and cell database.

    Interested parties can donate their location data if they so choose. Meanwhile, CrowdFlow developed the “iPhone Fireflies” video to demonstrate what the movements of 880 iPhones look like. The results are impressive. As their post indicates, the developers couldn’t decide on a color scheme for the video, so they produced three different ones:


    In the post’s comments section, developer Michael Kreil, who also posted the entry, explains the process in greater detail:

    The geo data of the iPhones are quite accurate, but I only know the locations at specific points in time. So for example, I know the accurate position of an iPhone at 12:03 and at 14:27, but I have no clue how this iPhone moved in the meantime.

    So my estimation is that an iPhone moves from the last known location at an average speed of 30km/h – in all possible directions. It’s like a diffusion process. That’s why the estimated location becomes more and more blurry and the light fades away.

    And vice versa: If I know that an iPhone will appear in one hour at a specific location, it should be somewhere nearby now – in a blur with a radius of 30km. (30km in 1 hour = 30km/h)

    And that’s why the image becomes blurry during the night. Most iPhones are not moving in the night; therefore they do not collect data; their positions are vaguer and the lights dissolve.
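    To make the diffusion idea concrete, here is a minimal Python sketch of the blur-radius math Kreil describes, assuming a fixed 30km/h speed and known position fixes at two timestamps. It is purely illustrative: the Fix class, function names, and coordinates are made up for this example and are not part of Kreil’s actual Delphi tooling.

```python
from dataclasses import dataclass

# Average speed assumed by the diffusion model described in the quote above.
MAX_SPEED_KMH = 30.0

@dataclass
class Fix:
    """A known (timestamp, latitude, longitude) position report from the log."""
    t_hours: float  # timestamp, expressed in hours
    lat: float
    lon: float

def uncertainty_radius_km(fix_time_h: float, query_time_h: float,
                          speed_kmh: float = MAX_SPEED_KMH) -> float:
    """Radius of the blur circle around a known fix at some other time.

    The phone is assumed to drift away from (or toward) the fix at up to
    `speed_kmh` in any direction, so the plausible region grows linearly
    with the time gap -- the 'diffusion' that makes the fireflies fade.
    """
    return abs(query_time_h - fix_time_h) * speed_kmh

def combined_radius_km(prev_fix: Fix, next_fix: Fix, query_time_h: float) -> float:
    """Blur radius at a moment between two known fixes.

    Forward constraint: the phone is within r_forward of the previous fix.
    Backward constraint: it is within r_backward of the next fix.
    Taking the smaller radius is a simple (hypothetical) way to combine
    the two; the real renderer effectively intersects the two circles.
    """
    r_forward = uncertainty_radius_km(prev_fix.t_hours, query_time_h)
    r_backward = uncertainty_radius_km(next_fix.t_hours, query_time_h)
    return min(r_forward, r_backward)

if __name__ == "__main__":
    # Example from the quote: fixes at 12:03 and 14:27, queried at 13:00.
    a = Fix(t_hours=12 + 3 / 60, lat=52.52, lon=13.40)   # coordinates are made up
    b = Fix(t_hours=14 + 27 / 60, lat=52.60, lon=13.55)
    print(round(combined_radius_km(a, b, 13.0), 1), "km blur radius at 13:00")
```

    Run as-is, the sketch prints a blur radius of 28.5km for the 13:00 query, illustrating how the uncertainty grows with the time gap and shrinks again as the next known fix approaches.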

    Kreil also indicates the process was developed using tools he wrote in Delphi, along with a combination of algorithms. The result offers a telling look at just how readily this information can be used to produce striking visual recreations, although it wouldn’t be surprising if some people reacted with trepidation, especially those who get fussy about privacy. Of course, considering these devices have been producing location data since the cellular era began, not just since these particular phones arrived, at some point it becomes common knowledge, or at least it should.
