Hiperwall Features and Software Development

Now that we have released a maintenance update to our third software release and are closing in on our fourth release (likely this Summer), I’ll comment on how our development has changed and how we focus on what to develop and when.

At the start of 2007, the HIPerWall software primarily consisted of two programs: the original TileViewer, which handled big-image viewing, and the very interactive NDVIviewer, which displayed regular images, movies, video streams, and more; I called it MediaViewer (more details on both can be found in this article).

I was lucky to hire Dr. Sung-Jin Kim back to the HIPerWall project as a postdoctoral researcher, and together we set about transforming the software. Note: When I write HIPerWall, it designates the research project, which is distinct from the Hiperwall company.

Sung-Jin developed a new TileViewer that could handle all the MediaViewer features and deal with big images much better than the original TileViewer. He added the ability to rotate anything, from a playing movie to a billion-pixel image, interactively and in real time. This new TileViewer formed the basis of the Hiperwall technology licensed from UCI to the Hiperwall company. Today’s product, however, bears little resemblance to that old code.

Over the years, many of the thousands of visitors to HIPerWall expressed interest in running their own software in high resolution on the wall. When told this entailed lots of parallel and distributed programming as well as a significant overhaul of their drawing code, people shied away. We decided we needed a way for people to show their applications on the tiled display without having to rewrite their code. We also wanted to provide the ability to use proprietary programs, such as PowerPoint, CAD, and GIS tools. One way of doing this is to capture the video output of a computer with a capture card and then stream the screen to the wall. We could already stream HD video, so this was certainly a workable solution, but it required very expensive (at the time) capture cards that tended to use proprietary codecs. It would also take enormous network bandwidth to stream a high-resolution PC screen. While we have this capability in the Hiperwall software today, we decided it was too brute-force and inelegant (and expensive) for the time.

I decided to use software to capture the screen and send it to the HIPerWall. I developed the ScreenSender (later renamed HiperSender, or just Sender) in Java so it could work on Mac, Windows, or Linux, yet still have enough performance to provide a productive, interactive experience. While the original Sender was fairly primitive and brute-force, today’s Sender software can send faster than many Display Nodes can handle and uses advanced network technology that lets us display tens of Senders simultaneously without seriously taxing the network.
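To give a flavor of what software screen capture looks like in Java, here is a minimal sketch in the spirit of that first brute-force Sender. The class name, host, port, and JPEG-over-TCP framing are my illustrative assumptions, not the actual Hiperwall Sender protocol.

```java
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import java.io.OutputStream;
import java.net.Socket;
import javax.imageio.ImageIO;

/**
 * Minimal screen-capture sender sketch: grabs the local desktop with
 * java.awt.Robot and pushes JPEG frames to a display host over TCP.
 * Host, port, and framing are hypothetical, not the Hiperwall protocol.
 */
public class ScreenSenderSketch {
    public static void main(String[] args) throws Exception {
        String displayHost = args.length > 0 ? args[0] : "display-node.example"; // hypothetical host
        int port = 5000; // hypothetical port

        Robot robot = new Robot();
        Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());

        try (Socket socket = new Socket(displayHost, port);
             OutputStream out = socket.getOutputStream()) {
            while (true) {
                // Grab the whole desktop as an image.
                BufferedImage frame = robot.createScreenCapture(screen);
                // Compress and send the entire frame; a smarter sender would
                // transmit only changed regions and use a better codec.
                ImageIO.write(frame, "jpg", out);
                Thread.sleep(100); // roughly 10 frames per second
            }
        }
    }
}
```

Even this naive version shows where the performance work goes: recompressing and resending the whole desktop every frame is wasteful, so avoiding unchanged regions is the obvious first optimization.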

We also started to improve the usability of the software. Initially, the software could be operated with a few key presses, but as we got more content and more capabilities, we knew we had to build a user interface. Sung-Jin and I defined an interface protocol and built a graphical user interface that let users choose content to display and view and change the properties of displayed objects.
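To make the idea concrete, here is a tiny Java sketch of the sort of thing such a control interface does: it sends a short text command to the wall to change a displayed object's properties. The host name, port, and command format are hypothetical, not the actual Hiperwall protocol.

```java
import java.io.PrintWriter;
import java.net.Socket;

/**
 * Illustrative sketch of a control-UI command: a small text message sent to
 * the wall controller to move and resize a displayed object. The command
 * format, host, and port are hypothetical, not the Hiperwall protocol.
 */
public class ControlCommandSketch {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("wall-controller.example", 6000);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
            // Ask the wall to reposition and resize a displayed object.
            out.println("SET object=movie-3 x=1024 y=0 width=3840 height=2160");
        }
    }
}
```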

So we had this powerful software that was starting to gain attention. First, the Los Angeles Times published a nice article on the front page of the California section; that was followed by a radio interview I did for a station that broadcasts National Academy of Engineering content; and the attention culminated in a CNN piece that was repeated around the world.

Somewhere around this time, Jeff Greenberg of TechCoastWorks came along to see if he could help us form a company. Because he had been in the computing technology industry for years, he was able to guide our efforts to make the software easy to use for commercial purposes.

Around the end of the year, Samsung became interested in licensing our product, so the real software effort began. While it is okay for research software to crash (in fact, if it does, you can claim that you’re pushing the edge), commercial software has to work as expected, and in this case, 24 hours a day, 7 days a week, for months at a time. Therefore, any memory leaks that would have been acceptable for a short run in the lab were no longer acceptable, nor were crashes in corner cases. We also had to work hard to improve performance. In the HIPerWall, we used Power Mac G5s with two or four processors each and advanced graphics cards (for the time). This was a pretty nice environment for our software, but the embedded PCs in Samsung’s monitors were not quite as fast and had significantly less graphics horsepower. We used a small 2×2 Samsung wall as a test bench and made the software robust enough that we demonstrated it on a huge 40-panel wall at the Samsung booth at the InfoComm show in Las Vegas in June 2008. We also had to make the software multilingual, which is not as easy as it sounds, even with Java’s support for Unicode characters. The Samsung-licensed version of the software supports 8-10 languages.
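For readers who have not done Java localization, the standard mechanism is ResourceBundle lookup, sketched below; the bundle name and key are hypothetical, and the sketch assumes per-language properties files such as ui_en.properties and ui_ko.properties exist on the classpath. One reason this is "not as easy as it sounds": in the Java versions of that era, .properties files were read as ISO-8859-1, so non-Latin strings had to be escaped as \uXXXX sequences (for example with the native2ascii tool).

```java
import java.util.Locale;
import java.util.ResourceBundle;

/**
 * Sketch of standard Java localization with ResourceBundle. UI strings live
 * in per-language properties files (ui_en.properties, ui_ko.properties, ...)
 * and are looked up by key at runtime, so one binary serves every language.
 * The bundle name "ui" and the key "menu.open" are hypothetical examples.
 */
public class LocalizationSketch {
    public static void main(String[] args) {
        for (Locale locale : new Locale[] {Locale.ENGLISH, Locale.KOREAN, Locale.JAPANESE}) {
            ResourceBundle messages = ResourceBundle.getBundle("ui", locale);
            System.out.println(locale + ": " + messages.getString("menu.open"));
        }
    }
}
```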

Choosing features to develop has changed from making what we think is cool to making things that will help customers and help sales. Our software still handles gigapixel images with aplomb, but for the many control rooms and network operations centers (NOCs) that use Hiperwall, the popular display objects are Senders (for monitoring whatever it is that is being monitored) and Streamers (to keep an eye on CNN and the weather). For digital signage applications, regular images and movies are popular, along with Streamers and Senders. To coordinate these complex display layouts, we provide a way to save the state of the Hiperwall as an Environment, which can be restored easily.
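As a rough illustration of the concept (not the actual Hiperwall implementation or file format), an Environment boils down to recording each displayed object and its placement so the whole layout can be written out and read back:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

/**
 * Conceptual sketch of saving and restoring a wall "environment": one line
 * per displayed object (type, source, position, size). The record layout
 * and file format here are hypothetical, not Hiperwall's actual format.
 */
public class EnvironmentSketch {
    record DisplayObject(String type, String source, int x, int y, int width, int height) {
        String toLine() {
            return String.join(",", type, source,
                    String.valueOf(x), String.valueOf(y),
                    String.valueOf(width), String.valueOf(height));
        }
        static DisplayObject fromLine(String line) {
            String[] f = line.split(",");
            return new DisplayObject(f[0], f[1], Integer.parseInt(f[2]),
                    Integer.parseInt(f[3]), Integer.parseInt(f[4]), Integer.parseInt(f[5]));
        }
    }

    static void save(List<DisplayObject> layout, Path file) throws IOException {
        // Write one line per displayed object.
        Files.write(file, layout.stream().map(DisplayObject::toLine).toList());
    }

    static List<DisplayObject> restore(Path file) throws IOException {
        // Read the lines back and rebuild the layout.
        List<DisplayObject> layout = new ArrayList<>();
        for (String line : Files.readAllLines(file)) {
            layout.add(DisplayObject.fromLine(line));
        }
        return layout;
    }
}
```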

We also added a Slideshow feature that can contain any of our object types with variable timing. It can even include overlays, such as a company logo. This feature is popular both for digital signage (stepping through products, for example) and for control rooms where there may be more information than fits comfortably on the wall at a given time. (Though the right answer is to buy a larger wall! 😎)

In response to customer requests, we added scheduling capability to show different environments at different times on different days, and so on. UCI’s Student Center Hiperwall system makes tremendous use of the scheduler for its very artistic content.

Another example of our responsiveness to customer needs comes from the large Hiperwall-based Samsung UD system installed at the Brussels Airport. They were using three infrared cameras to view passengers along the walkways and then showing the streams on the tiled display along the walkway. One camera was on the opposite side, so its video needed to be flipped horizontally. They used another computer to do the flip, which added some delay. Since such a flip is trivial on today’s graphics cards, we added flip options to the Streamer software, eliminating the extra hardware and the delay.
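For the curious, a horizontal flip really is almost free: in plain Java 2D it is just a negative-scale transform, as in the illustrative sketch below (my own example, not the actual Streamer code), and doing the equivalent on the GPU costs essentially nothing.

```java
import java.awt.Graphics2D;
import java.awt.geom.AffineTransform;
import java.awt.image.BufferedImage;

/**
 * Illustrative sketch of flipping a video frame horizontally with Java 2D.
 * A scale of (-1, 1) mirrors the image; translating by the width keeps it
 * on screen. This is an example only, not the Streamer implementation.
 */
public class FlipSketch {
    static BufferedImage flipHorizontally(BufferedImage frame) {
        BufferedImage flipped = new BufferedImage(
                frame.getWidth(), frame.getHeight(), BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = flipped.createGraphics();
        AffineTransform flip = AffineTransform.getScaleInstance(-1, 1);
        flip.translate(-frame.getWidth(), 0); // shift the mirrored image back into view
        g.drawImage(frame, flip, null);
        g.dispose();
        return flipped;
    }
}
```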


With our next release, we will add many more customer-centric features that will make Hiperwall significantly more powerful, secure, and collaborative, but I will not comment on any here until they are officially announced by the company.
