2013-07-08

This Digital Telescope Will Blow Your Mind

One year ago, the tallest structure in the EU was inaugurated with cutting-edge digital telescopes on its viewing deck. Thanks to some clever hardware encoders and software trickery, visitors to the Shard can view London as never before. Here’s how they did it.

Standing on the south bank of the Thames, right next to London Bridge rail station, is the Shard, the tallest building in the European Union. Construction began in March 2009 and finished on July 5, 2012, at a cost of £435 million. Once complete, the tower stood 1,016 feet tall. From street level (and, indeed, eye level all over London), the Shard is an impressive sight, its triangular glass form towering over the next tallest buildings in the city. But as impressive as the building is from the outside, most people visit the Shard to see the view from its 72nd-floor observation deck. On a perfectly clear day, the naked eye can make out the blue waters of the North Sea and the English Channel, and pick out thousands of Londoners, buses, and black cabs that look like toy cars on the streets below.

But let’s be real--this is London, and there is rarely such a thing as a clear day. So the owners of the building’s observation deck (appropriately named “The View from The Shard”) knew that if they were going to charge people £25 a head to come up and see the view, they would need to make sure the view was always beautiful. Of course, no matter how far technology has come in the last 10 years, we still can’t change the weather. That’s why the Shard turned to Canadian company GSM Technologie and its newly invented digital augmented reality telescope, called the Tellscope.

A New Kind Of Telescope That Runs Software

The idea behind a Tellscope is simple: gone are the cylindrical tubes of the telescopes invented in the 1600s. Instead, you look only at a touchscreen the size of an iPad. As you swivel and pan it around, you see the cityscape as it looks on a perfectly clear day or night, no matter the time of day, no matter the weather.

Yes: This telescope sees through weather.

In fact, the Tellscope has four modes: Live, Day, Sunset, and Night. Live mode works as a normal telescope would: the user looks at the screen and sees what is happening in the city below them, with all the fog and rain and other weather as it happens. The live view comes courtesy of an industrial-strength motion capture camera (like the ones used on Hollywood productions) housed in the Tellscope’s body. Using physical zoom buttons on the Tellscope’s handles, the viewer can zoom in or out to see details across the city.

However, this is London, and the weather is in charge. If it’s foggy outside, it doesn’t matter how advanced the camera is. But that’s where the Day, Sunset, and Night modes come in. If the weather or time of day is not to your liking, simply tap a different mode and you’ll be presented with an augmented image of London as it looks on its most beautiful day or crystal clear night. You can still pan around the city just as you would in the live view, and zoom in and out.

And as you might have guessed from the “Tell” in “Tellscope,” the device doesn’t just provide all-weather views; it also provides augmented reality overlays that let the viewer touch a building right on the screen and see information about it. For example, a viewer could touch St. Paul’s Cathedral and a pop-up appears on-screen telling them what they are looking at, along with some history of the building.

Sure, the augmented reality overlays are informative, and the seamless switching from the live view to a stunningly realistic pre-captured 3-D panorama of London at day, sunset, or night is impressive. But the illusion of perfect weather isn’t what makes the Tellscopes so interesting--it’s how they pull it off that amazed me.

How This Telescope’s Software Creates Illusions Within Illusions

When I first went to The Shard and used the Tellscopes, I assumed they worked the way they appear to: you pan the device, a laser rangefinder (like the ones commonly used by civil engineers) hits the target building and calculates the distance, that tells the Tellscope what the building is, and an overlay pop-up appears on the device’s touchscreen so you can read more about what you are looking at.

But then I looked at the front of the Tellscope and found no hole where a rangefinder might be hidden. Besides, would the Mayor of London really allow people to shoot lasers from the tallest building in the city?

Then I thought perhaps it works by optical recognition: the software identifies buildings (much as OCR recognizes text) and knows what they are that way. But then a wisp of cloud drifted by, yet the Tellscope could still tell me what building I was viewing through the fog. I was stumped. How does this thing actually work?

“The Tellscopes don’t actually recognize the scene,” says Julian Choquette, product development engineer of the Tellscope. “It gives the illusion that we’re doing that, but it’s actually all mechanical tracking inside the Tellscope through calibration and the software.”

Choquette says that all a Tellscope actually recognizes is its pan and tilt position and its fixed location on the observation deck, called its “index” position. What the Tellscope “sees” through its camera is completely irrelevant to its operation. Inside the Tellscope are two hardware shaft encoders: the horizontal encoder reports the pan position of the Tellscope and the vertical encoder reports the tilt position. Each encoder has a precision of 0.036°, which translates to a 0.05 mm movement of the Tellscope by a user.
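
To make those numbers concrete, here is a minimal sketch, in Python rather than the Tellscope’s actual WPF code, of how raw shaft-encoder counts could be turned into pan and tilt angles relative to the calibrated index position. A precision of 0.036° implies 10,000 counts per revolution; the count values and function names below are illustrative assumptions, not taken from GSM’s software.

```python
# Minimal sketch (not the vendor's code): converting raw shaft-encoder counts
# into pan/tilt angles. 360 deg / 0.036 deg per count = 10,000 counts per rev.
COUNTS_PER_REV = 10_000
DEG_PER_COUNT = 360.0 / COUNTS_PER_REV  # 0.036 degrees


def encoder_angle(raw_count: int, index_count: int) -> float:
    """Angle in degrees relative to the calibrated "index" position."""
    delta = (raw_count - index_count) % COUNTS_PER_REV
    angle = delta * DEG_PER_COUNT
    # Wrap into (-180, 180] so a small move left reads as a small negative angle.
    return angle - 360.0 if angle > 180.0 else angle


# Hypothetical readings: the horizontal encoder gives pan, the vertical gives tilt.
pan_deg = encoder_angle(raw_count=2_612, index_count=2_500)   # ~ +4.03 degrees
tilt_deg = encoder_angle(raw_count=9_950, index_count=50)     # ~ -3.60 degrees
print(f"pan {pan_deg:.2f} deg, tilt {tilt_deg:.2f} deg")
```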

“These encoders are responsible for tracking the position of the Tellscope in space,” Choquette says. “Essentially, when you move it left and right or up and down, it knows how many degrees you have moved the device in real time. Here’s where the tricky part comes in. We construct a virtual reality scene from the point of view of the Tellscope which we call our reference scene. All the points of interest are mapped out on this scene via their respective angular coordinates. The Tellscope is then calibrated to this reference scene via several control points. Now, as you move the Tellscope in Live view mode, the Tellscope is actually thinking about this in its reference scene. The points of interest overlaid on the screen correspond to their position in the reference scene and also the live view since the Tellscope was calibrated to match both. This is done several times per second (as fast as the eye can process images) and the illusion is perfect. It would be too error-prone to actually recognize the scene. So we decided to do a sleight-of-hand trick to achieve our goals.”
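
In other words, the pop-ups come from a lookup over angles, not from vision. A hedged sketch of that reference-scene trick might look like the following: points of interest stored at fixed angular coordinates, a pan/tilt offset averaged from a few control points, and a nearest-neighbour lookup driven purely by the encoder angles. The landmarks, angles, and tolerance are invented for illustration.

```python
# A hedged sketch of the "reference scene" Choquette describes. Every name,
# angle, and tolerance here is invented; only the idea (angular lookup plus
# control-point calibration) comes from the article.
from dataclasses import dataclass


@dataclass
class PointOfInterest:
    name: str
    pan_deg: float    # angular position within the reference scene
    tilt_deg: float


REFERENCE_SCENE = [
    PointOfInterest("St. Paul's Cathedral", pan_deg=-41.2, tilt_deg=-6.8),
    PointOfInterest("Tower Bridge", pan_deg=23.5, tilt_deg=-12.1),
]


def calibrate(control_points):
    """Average the (reference - measured) error over known control points,
    producing a pan/tilt offset that aligns live encoder angles to the scene."""
    n = len(control_points)
    pan_err = sum(ref[0] - meas[0] for ref, meas in control_points) / n
    tilt_err = sum(ref[1] - meas[1] for ref, meas in control_points) / n
    return pan_err, tilt_err


def poi_under_crosshair(pan_deg, tilt_deg, offset, tolerance_deg=1.0):
    """Return the point of interest nearest the current aim, if any is close enough."""
    pan, tilt = pan_deg + offset[0], tilt_deg + offset[1]
    best, best_dist = None, tolerance_deg
    for poi in REFERENCE_SCENE:
        dist = max(abs(poi.pan_deg - pan), abs(poi.tilt_deg - tilt))
        if dist < best_dist:
            best, best_dist = poi, dist
    return best


# Two control points, each a (reference angles, measured angles) pair.
offset = calibrate([((0.0, 0.0), (0.4, -0.2)), ((90.0, 0.0), (90.6, -0.1))])
hit = poi_under_crosshair(-41.0, -6.5, offset)
print(hit.name if hit else "nothing of note here")   # -> St. Paul's Cathedral
```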

“Getting panorama to actually work with a group of Tellscopes was pretty complicated,” Frédéric Savard, software engineer of the Tellscopes, tells me. “To find the right way to calibrate the group together so that they can share a panorama without creating some distortion of the image was not an easy task.”

The panoramas Savard is talking about are the ones programmed into every Tellscope. In addition to the live feed from its camera, each Tellscope in the Shard also has Day, Night, and Sunset views. These views are actually pre-shot 3-D panoramic spheres, each over a gigapixel in size.
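
An image that size can’t be drawn in one go, which is why Savard’s viewer (described below) works in tiles. As a rough illustration of the geometry, here is a sketch of how a pan/tilt reading might be mapped to a tile address inside an equirectangular gigapixel panorama; the resolution, tile size, and projection are assumptions, since the article only says each sphere exceeds a gigapixel.

```python
# Sketch only: mapping a pan/tilt reading to a tile inside an equirectangular
# gigapixel panorama. The resolution, tile size, and projection are assumptions.
PANO_WIDTH, PANO_HEIGHT = 65_536, 32_768   # ~2.1 gigapixels covering 360 x 180 degrees
TILE_SIZE = 256                            # pixels per square tile


def tile_for_angles(pan_deg: float, tilt_deg: float):
    """Tile (column, row) containing the pixel at the given viewing angles."""
    # Pan in [-180, 180) maps to x in [0, width); tilt in [-90, 90] maps to y.
    x = (pan_deg + 180.0) / 360.0 * PANO_WIDTH
    y = (90.0 - tilt_deg) / 180.0 * PANO_HEIGHT
    return int(x) // TILE_SIZE, int(y) // TILE_SIZE


print(tile_for_angles(0.0, 0.0))      # straight ahead and level: (128, 64)
print(tile_for_angles(-41.0, -6.5))   # the aim used in the lookup sketch above: (98, 68)
```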

“A challenge was to create a highly efficient multi-core panorama viewer to display the gigapixel panoramas,” Savard explains. “The difficulty here is to correctly manage access to the user interface from multiple concurrent background threads. The fact that tile loading can be scheduled and canceled at any moment when the user moves the camera quickly increases the complexity of the viewer.”
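
The concurrency problem Savard describes, loads that run on background workers and get cancelled the moment the user pans them out of view, with results handed back safely to the UI, might look roughly like the sketch below. It is Python for brevity rather than the WPF/C# the Tellscopes actually run, and every class and callback name is hypothetical.

```python
# Rough sketch of a cancellable background tile loader; not GSM's implementation.
from concurrent.futures import ThreadPoolExecutor


class TileScheduler:
    def __init__(self, load_tile, workers: int = 4):
        self._pool = ThreadPoolExecutor(max_workers=workers)
        self._load_tile = load_tile   # callable(tile) -> image data, runs off the UI thread
        self._pending = {}            # tile -> Future

    def request(self, visible_tiles, on_loaded):
        """Call on every pan/zoom update with the set of tiles now on screen."""
        # Cancel queued loads for tiles that scrolled out of view. Only loads
        # that have not started yet can actually be cancelled.
        for tile, future in list(self._pending.items()):
            if tile not in visible_tiles and future.cancel():
                del self._pending[tile]
        # Schedule loads for newly visible tiles.
        for tile in visible_tiles:
            if tile not in self._pending:
                future = self._pool.submit(self._load_tile, tile)
                # In WPF this callback is where you would marshal back to the
                # UI thread (e.g. via the Dispatcher) before touching the screen.
                future.add_done_callback(
                    lambda f, t=tile: None if f.cancelled() else on_loaded(t, f.result())
                )
                self._pending[tile] = future
```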

But thanks to some clever coding in Microsoft WPF, the framework the Tellscope software is written in, Savard was able to weave the pre-captured 3-D panoramas together with the live feed from the Tellscope’s camera, allowing seamless switching between the four views while the Tellscope still knows what it is looking at thanks to the coordinates from the hardware encoders. From a software engineering perspective, this is something that had never existed before, and it’s what enables the Tellscope to know what it’s looking at without actually seeing a thing.
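
Conceptually, the switch is seamless because every mode renders from the same encoder-derived pan/tilt state; only the image source changes. A toy sketch of that structure, with entirely invented class and method names, could look like this:

```python
# Toy structure only: the same pan/tilt/zoom state drives every view mode,
# so changing mode never loses the viewer's aim or the point-of-interest overlays.
class TellscopeView:
    MODES = ("live", "day", "sunset", "night")

    def __init__(self, camera, panoramas):
        self.camera = camera          # live feed source
        self.panoramas = panoramas    # e.g. {"day": ..., "sunset": ..., "night": ...}
        self.mode = "live"

    def switch_mode(self, mode):
        if mode not in self.MODES:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode              # the next render() simply reads a different source

    def render(self, pan_deg, tilt_deg, zoom):
        # Encoder-derived angles drive every mode identically.
        if self.mode == "live":
            frame = self.camera.frame(pan_deg, tilt_deg, zoom)
        else:
            frame = self.panoramas[self.mode].crop(pan_deg, tilt_deg, zoom)
        return frame                  # point-of-interest overlays are composited on top
```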

The Difficult Realities Of Prototyping Telescope Hardware

Of course, the software and the illusionary optics weren’t the only challenges in building the telescope of the 21st century. As with prototyping any new device, there were a lot of practical challenges to overcome. For starters, there’s a full-blown computer inside each Tellscope. When Choquette began prototyping the first Tellscope as a wooden mockup in 2009, it used an industrial-grade motherboard (as did the first production Tellscopes). For the latest iteration of Tellscopes, Choquette was able to move to off-the-shelf components.

“The benefits surrounding the use of an integrated graphics card outweighed the warranty advantages conveyed by an industrial motherboard. We use some of the smallest motherboards out there, the mITX standard,” Choquette explains. “The rest of the system is equipped with a multicore processor, a screaming fast solid state drive, and lots of RAM. At the moment, we don’t use our systems at 100% capacity. The core of the software was made to run on 2009 processors and in 2013, we can get much more powerful CPUs at the same price point. This gives us some flexibility for further software improvements that could tax the CPU a bit more. The fast GPU and RAM helps us display our panoramas on screen rapidly, even when they are gigantic.”

But given that these Tellscopes are operating 24 hours a day, 365 days a year, all those internal computer components led to a big issue: heat.

“Heat was the topic of a year’s worth of research and development,” Choquette says. “The solution involves different components and a lot of testing. In the previous generation, the graphics card and processing unit were different entities. This created additional heat generation and required many fans to redirect this heat outside the chassis. With the magic that goes on behind Intel labs, we were able to eliminate the dedicated GPU and use the one now embedded with most mainstream CPUs. Due to this, the total heat generation was reduced. Still, we needed to design a robust, durable, and compact fan to get that heat out of the combined CPU and GPU. This is one example where we use industrial-grade components. This heat, which has exited the computer chassis, must now exit the Tellscope chassis. Our research identified a combination of ducts that allowed us to push out this heat efficiently without creating too much noise. Naturally, our focus is to create a device that runs smoothly all year round. Luckily, heat has never been an issue so far thanks to the careful design.”

And standing 72 floors above London, looking through the Tellscopes, it’s that careful design and clever software engineering that make the end result so stunning. The hardware and software inside each Tellscope are a brilliant way around the fundamental limits that weather and a local climate place on traditional optical telescopes. And for Londoners, that means the first truly new telescope of the 21st century will let them view their cityscape as never before.

Image credit: Shard photos © 2013 Jose Farinha.

