Apple takes TechCrunch for a ride — updated

Skip the arm-waving about rebuilding Apple Maps from the ground up. Go straight to Matthew Panzarino's ride-along:

Earlier this week I took a ride in one of the vans as it ran a sample route to gather the kind of data that would go into building the new maps. Here’s what’s inside.

In addition to a beefed up GPS rig on the roof, four LiDAR arrays mounted at the corners and 8 cameras shooting overlapping high-resolution images – there’s also the standard physical measuring tool attached to a rear wheel that allows for precise tracking of distance and image capture. In the rear there is a surprising lack of bulky equipment. Instead, it’s a straightforward Mac Pro bolted to the floor, attached to an array of solid state drives for storage. A single USB cable routes up to the dashboard where the actual mapping capture software runs on an iPad.

While mapping, a driver…drives, while an operator takes care of the route, ensuring that a coverage area that has been assigned is fully driven, as well as monitoring image capture. Each drive captures thousands of images as well as a full point cloud (a 3D map of space defined by dots that represent surfaces) and GPS data. I later got to view the raw data presented in 3D and it absolutely looks like the quality of data you would need to begin training autonomous vehicles...

One of the special sauce bits that Apple is adding to the mix of mapping tools is a full-on point cloud that maps in 3D the world around the mapping van. This allows them all kinds of opportunities to better understand what items are street signs (retro-reflective rectangular object about 15 feet off the ground? Probably a street sign) or stop signs or speed limit signs.

It seems like it also could enable positioning of navigation arrows in 3D space for AR navigation, but Apple declined to comment on “any future plans” for such things.

Apple also uses semantic segmentation and Deep Lambertian Networks [huh?] to analyze the point cloud coupled with the image data captured by the car and from high-resolution satellites in sync. This allows 3D identification of objects, signs, lanes of traffic and buildings and separation into categories that can be highlighted for easy discovery.

The coupling of high-resolution image data from car and satellite, plus a 3D point cloud, results in Apple now being able to produce full orthogonal reconstructions of city streets with textures in place...

The new version of Apple Maps will be in preview next week with just the Bay Area of California going live.
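The sign-detection heuristic Panzarino mentions ("retro-reflective rectangular object about 15 feet off the ground? Probably a street sign") is easy to picture in code. Here's a toy sketch of that kind of rule-of-thumb classification — not Apple's actual pipeline, and every field and threshold below is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Cluster:
    """A segmented blob of LiDAR points (all fields are assumptions for this sketch)."""
    reflectivity: float    # 0.0-1.0, mean return intensity (retro-reflective paint is bright)
    width_ft: float        # bounding-box width
    height_ft: float       # bounding-box height
    elevation_ft: float    # height of cluster center above the road surface
    rectangularity: float  # 0.0-1.0, how well the points fill their bounding box

def classify(c: Cluster) -> str:
    """Label a point-cloud cluster using simple geometric heuristics."""
    retro_reflective = c.reflectivity > 0.8
    rectangular = c.rectangularity > 0.7
    sign_sized = 1.0 <= c.width_ft <= 4.0 and 0.5 <= c.height_ft <= 4.0
    if retro_reflective and rectangular and sign_sized:
        if c.elevation_ft >= 12.0:
            return "street sign"        # high-mounted, like an overhead name plate
        if c.elevation_ft >= 5.0:
            return "stop/speed sign"    # post-mounted at driver eye level
    return "unknown"

# A bright rectangle 15 feet up reads as a street sign:
print(classify(Cluster(0.9, 3.0, 1.0, 15.0, 0.9)))  # street sign
```

In practice Apple pairs the geometry with semantic segmentation over the camera imagery, so rules like these would at most be one weak signal among many.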

My take: At this rate, at this resolution, with human editors in the loop, Apple will never get to Western Mass.

UPDATE: With regard to Apple ever making it to my neck of the woods, reader Jeff Larvia sent me the photo below, shot a year ago at a gas station in Holyoke, Mass., 30 miles away. "Maybe they mapped your street already," he suggests.

I doubt it. That gas station may be in Holyoke, but it also sits conveniently at the junction of two major interstates: I-91 (north-south) and I-90 (east-west). I wouldn't be surprised if that Apple car were just passing through. Google cars have ventured farther north, but none has ever come near my street.


  1. Scott Davis said:
Much to my surprise, I saw one of these Apple cars in Northeast Ohio a few weeks ago. I have no idea how they decide where to map first. I’m sure they have a plan, but I sure don’t have a clue what it is.
    – Dr. Scott

    June 29, 2018
  2. Roger Schutte said:
I saw one in Augusta, Maine two weeks ago. Given that, they should be able to find Western Mass sooner rather than later.

    June 29, 2018
  3. David Drinkwater said:
    From two to ten years old, I lived in Amherst, MA. Never heard of Hadley grass. I’m sure population and wealth density in that area are high enough to get you mapped, though.

    June 30, 2018
