The recent advances in AI-based vehicle navigation, together with the vision shared by many automotive companies of fully self-driving cars, open up the opportunity to transform the car into something more than a mere means of transportation. As technology turns the driver into just another passenger, the car interior as a whole can become an entertainment room, a temporary office, or even an area for relaxation, reshaping what has so far been wasted commute time into high-value time. The opportunities that open up for introducing AR/VR technology towards implementing this vision are tremendous.
In particular, the network capabilities brought by 5G, such as ultra-low latency and very high bandwidth, can already support the requirements of AR and VR applications. 5G deployment is thus the second link required besides self-driving car technology. Until self-driving cars become more established, 5G capacity can in any case bring to life the vision of immersive in-car entertainment for passengers, which may span from immersive movies and shows to live concerts, sports, and interactive gaming. Such applications place stringent requirements, in terms of latency, bandwidth, and reliability, on the communication network.
In this context, and as a first proof-of-concept VR application enabled by 5G connectivity, HIT HYPERTECH INNOVATIONS will, within the 5G-IANA project, develop and showcase a VR application in which virtual reality users virtually join an actual tour guide on a double-decker bus and are represented in the VR space by their avatars. Users will receive on their HMD (Head-Mounted Display) the video of the tour surroundings, streamed by a high-resolution 360° camera mounted on the bus/vehicle taking the real tour. An innovative adaptive delivery scheme for 360° videos, based on prediction of each user's field of view, will be used to reduce the server resources expended. Moreover, historical information and trivia about the attractions along the route will be triggered and presented through GPS-driven landmark indicators. Via their avatars, users will be able to gesture, speak, and listen to one another from their dedicated virtual bus seats, which will be assigned when they enter.
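The general idea behind viewport-adaptive 360° delivery can be sketched as follows: the equirectangular panorama is split into equal-width yaw tiles, and only the tiles overlapping the user's predicted field of view are requested at high quality, with the rest served at low quality. This is a minimal illustrative sketch, not the project's actual scheme; the tile layout, the 90° FoV width, and the tile counts are assumptions.

```python
def tiles_in_fov(yaw_deg, fov_deg=90.0, n_tiles=8):
    """Indices of equal-width yaw tiles that overlap a field of view
    centred on yaw_deg (angles in degrees, tiles cover 0..360)."""
    tile_width = 360.0 / n_tiles
    selected = []
    for i in range(n_tiles):
        center = i * tile_width + tile_width / 2.0
        # smallest angular distance between tile centre and view centre
        diff = abs((center - yaw_deg + 180.0) % 360.0 - 180.0)
        if diff <= fov_deg / 2.0 + tile_width / 2.0:
            selected.append(i)
    return selected

def choose_qualities(predicted_yaw, n_tiles=8):
    """Per-tile quality decision from the predicted viewing direction."""
    hi = set(tiles_in_fov(predicted_yaw, n_tiles=n_tiles))
    return ["high" if i in hi else "low" for i in range(n_tiles)]
```

With a head looking straight ahead (yaw 0°) and 8 tiles, only the four tiles around the front of the sphere are fetched at high quality, halving the high-quality tile count relative to full-panorama delivery. A real scheme would also predict `predicted_yaw` from recent head-motion samples rather than use the current pose directly.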
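A GPS-driven landmark indicator can be triggered by comparing the bus position against a geofence around each attraction. The sketch below uses the haversine great-circle distance; the landmark names, coordinates, and trigger radii are illustrative placeholders, not the application's actual data.

```python
import math

# Hypothetical example landmarks: (name, latitude, longitude, trigger radius in metres)
LANDMARKS = [
    ("Example Tower", 40.6264, 22.9484, 120.0),
    ("Example Arch", 40.6322, 22.9519, 100.0),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def landmarks_in_range(lat, lon, landmarks=LANDMARKS):
    """Names of landmarks whose trigger radius contains the current position."""
    return [name for name, la, lo, radius in landmarks
            if haversine_m(lat, lon, la, lo) <= radius]
```

As the bus GPS position enters a landmark's radius, the returned names would drive the display of the corresponding historical information and trivia in the VR scene.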