ABOUT

Use Cases

5G-IANA tools and components will be demonstrated through 7 highly demanding Automotive-related use cases:

UC1 covers the integration, demonstration and validation of advanced remote driving functionalities in the open and enhanced experimentation platform developed in the 5G-IANA project. The aim is to use a 5G-connected vehicle that is controlled remotely via a teleoperation platform.
In the first phase, the vehicle will be equipped with both a front and a rear camera to transmit video to the edge of the 5G network. The vehicle used in this UC is an automated guided vehicle (AGV) with an “Ackermann” steering configuration: the rear wheels provide the traction force, while the adjustable front wheels steer the vehicle. The 5G-enabled vehicle will be connected to the edge of the network, sending information from its on-board sensors together with a constant video feed. At the edge, an AI/ML algorithm will process the video and overlay information about the elements detected on the road, such as pedestrians, cars, or traffic signals. An additional warning feature will use the sensors and lidar mounted on the vehicle to measure the distance to obstacles, providing the driver with additional information and/or stopping the vehicle when a potential accident is imminent.
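The obstacle-warning logic described above can be sketched as follows. This is a minimal illustration, not the project's implementation: the distance thresholds and the three-level outcome are assumptions introduced for the example.

```python
# Minimal sketch of the UC1 warning feature: lidar range readings are
# reduced to the closest obstacle distance, and hypothetical thresholds
# (not specified by the project) decide between informing the remote
# driver and triggering an emergency stop.

STOP_DISTANCE_M = 2.0   # assumed emergency-stop threshold
WARN_DISTANCE_M = 10.0  # assumed driver-warning threshold

def assess_obstacles(lidar_ranges_m):
    """Return ('stop' | 'warn' | 'clear', closest obstacle distance in metres)."""
    valid = [r for r in lidar_ranges_m if r > 0.0]  # drop invalid returns
    if not valid:
        return "clear", float("inf")
    closest = min(valid)
    if closest <= STOP_DISTANCE_M:
        return "stop", closest
    if closest <= WARN_DISTANCE_M:
        return "warn", closest
    return "clear", closest

action, distance = assess_obstacles([34.2, 8.7, 51.0])
# an obstacle at 8.7 m falls inside the warning band -> ("warn", 8.7)
```

In a real deployment this decision would run at the edge alongside the AI detection algorithm, so the warning can be fused with the detected object class (pedestrian, car, traffic signal) before being shown to the teleoperator.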
The second phase will additionally include a series of advanced features that aim to push 5G to the next level, starting with the integration of two additional video feeds (left and right side) into a 3D environment. The video from the four cameras installed in this phase (front, left, rear, right) will be combined and shown to the end user as a 360° environment through VR glasses. In addition, the warning service will be enhanced with a 3D tracking algorithm that combines the output of the AI detection algorithm with the lidar data.
The use case aims to showcase a manoeuvre coordination service, made available through the 5G-IANA infrastructure, capable of lowering the risk of collision in complex junction scenarios by prescribing suitable paths and priorities for connected, and possibly automated, vehicles directed by a shared coordination system. It facilitates the interaction between AVs and human-driven cars at merging points, complex intersections, and in congested traffic. The use case will include autonomous and traditional cars together with Virtual Vehicles that can be inserted at will to recreate realistic traffic conditions. These virtual simulated vehicles, fully integrated into the AOEP, will also serve as a powerful tool to facilitate experimentation throughout the 5G-IANA platform ecosystem.
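One simple way a shared coordination system can assign priorities at a junction is to order vehicles by their estimated time of arrival and reserve non-overlapping crossing slots. The sketch below is a hypothetical illustration of that idea; the slot duration and first-come-first-served policy are assumptions, not the service's actual algorithm.

```python
# Hypothetical sketch of junction coordination: each vehicle (real or
# virtual) reports its ETA at the conflict zone; the coordination service
# orders vehicles by ETA and reserves non-overlapping crossing slots.

CROSSING_SLOT_S = 3.0  # assumed time each vehicle occupies the junction

def assign_crossing_slots(etas_s):
    """etas_s: {vehicle_id: ETA in seconds}. Returns {vehicle_id: slot start}."""
    slots, next_free = {}, 0.0
    for vid, eta in sorted(etas_s.items(), key=lambda kv: kv[1]):
        start = max(eta, next_free)  # a vehicle cannot enter before it arrives
        slots[vid] = start
        next_free = start + CROSSING_SLOT_S
    return slots

slots = assign_crossing_slots({"av1": 4.0, "car2": 5.0, "vv3": 4.5})
# av1 crosses at t=4.0, vv3 at t=7.0, car2 at t=10.0
```

Because Virtual Vehicles are fully integrated into the AOEP, entries like "vv3" above could be simulated vehicles injected to stress-test the coordination policy under realistic traffic load.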
This use case offers an immersive virtual bus-tour experience for virtual reality headset users. Users select an avatar to represent them in the VR space and join a guided tour on a double-decker bus. The avatars are able to gesture, speak and listen to one another from their dedicated virtual bus seats, assigned on entry. Users receive on their HMD (Head-Mounted Display) the video of the tour surroundings, streamed through a high-resolution 360° camera mounted on the bus while it performs the “real-life” tour. During the tour, the VR application delivers GPS-driven landmark indicators with information about landmarks and attractions, letting users learn about each landmark's history in an interactive manner.
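The GPS-driven landmark indicators can be triggered by simple geofencing: when the bus's reported position comes within a radius of a known landmark, the indicator appears in the VR scene. The sketch below illustrates this under stated assumptions; the trigger radius and landmark coordinates are illustrative values, not project data.

```python
# Illustrative geofencing sketch for the VR bus tour: show a landmark
# indicator when the bus GPS position is within a trigger radius.
import math

TRIGGER_RADIUS_M = 150.0  # assumed activation distance

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def active_landmarks(bus_pos, landmarks):
    """Return names of landmarks within the trigger radius of the bus."""
    lat, lon = bus_pos
    return [name for name, (la, lo) in landmarks.items()
            if haversine_m(lat, lon, la, lo) <= TRIGGER_RADIUS_M]

# Illustrative call with made-up coordinates:
active_landmarks((48.3989, 9.9920), {"cathedral": (48.3985, 9.9916)})
```

In practice the bus position would be streamed alongside the 360° video, so the indicator overlay stays synchronized with what the users see on their HMDs.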
This use case addresses Municipalities / City Councils and other touristic property administrators, who can enhance campaigns or advertisements aiming to increase tourist inflow, as well as companies in the tourism industry, which can enhance the value chain and create new revenue streams.
This infotainment application makes advanced uses of VR technology accessible to a much wider audience and builds not only a content platform that can produce revenue from tourists, but also a social platform and community. This offers many creative and inclusion opportunities, as multiple participants enjoy touristic attractions in the same virtual space, talking and listening to each other in real time, sharing experiences and interacting naturally as if truly in the same place. Users can experience the world in a new and engaging way and participate in touristic services without the limits of space and time.
This use case is being tested on the 5G-IANA platform to optimize the above service chain, aiming to reduce the bandwidth and computational resources required per user for a high-resolution 360° stream, since the per-user bandwidth needed for such a stream is a hindering factor in deploying 360° video applications for multiple participants.
The UC4-ACOV Intelligent NetApp aims to provide high-quality AR content streaming by taking advantage of future web AR applications, the MEC and 5G connectivity. For a UE entering the 5G network, the Intelligent NetApp will provide information in the form of high-quality virtual 3D objects embedded in the user's 3D map (e.g., Google Maps) on their mobile device (e.g., iPhone/iPad). This is an excellent example of the convergence of edge computing, the ARCore Geospatial API and 5G networking, giving users the ability to interact with content, digital character displays, virtual experiences and outdoor navigation. Instead of using the Google cloud, the NetApp will leverage a MEC server so that the experience is rendered on powerful, edge-based GPUs and then streamed to any mobile device. Offloading the processing from the mobile device in this way provides the best user experience. Thus, the NetApp will combine edge computing, 5G networking and AR technology to offload the computing power needed to display high-quality 3D objects, rendered by Unreal Engine, and stream them down to AR-enabled devices. The 3D-object streaming will be provided in a 3D navigation environment similar to Live View, using the ARCore Geospatial API and the MEC server. Supporting marker-less AR streaming content with the help of MEC elements is challenging for AR applications, but it minimizes the battery consumption of the mobile device. In addition, when an on-board unit is used and battery consumption is no longer a concern, 3D-object streaming is useful for real-time collaborative experiences with other users, identified through a unique 3D-object descriptor. The Intelligent NetApp will also take into account the network coverage offered in Ulm, combined with the user's speed, and adjust to the system requirements accordingly.
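The last point, adjusting to coverage and user speed, amounts to an offload decision: stream edge-rendered content when the network supports it, otherwise fall back to on-device rendering. The sketch below is purely illustrative; the thresholds and the binary edge/device choice are assumptions, not the NetApp's actual logic.

```python
# Illustrative sketch (not the project's actual logic) of the offload
# decision: render 3D objects on the MEC edge GPU and stream them when
# coverage, downlink rate and user speed allow; otherwise fall back to a
# lower-quality on-device render. Thresholds are assumed values.

MIN_DOWNLINK_MBPS = 50.0   # assumed bitrate for a high-quality 3D stream
MAX_SPEED_KMH = 100.0      # assumed speed limit for stable edge streaming

def choose_render_target(downlink_mbps, user_speed_kmh, in_coverage):
    """Return 'edge' for MEC/GPU rendering, 'device' for local fallback."""
    if (in_coverage
            and downlink_mbps >= MIN_DOWNLINK_MBPS
            and user_speed_kmh <= MAX_SPEED_KMH):
        return "edge"
    return "device"
```

A real implementation would likely use the network-condition predictions of other NetApps (such as UC6-NSTAT, below) rather than instantaneous measurements.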
Use Case 5 (UC5) aims to develop a feature, integrated into the 5G-IANA platform, that will detect aggressive and distracted driving (hazardous events) and transmit warning notifications about the road risk level to other vehicles. The UC5 leader, OSeven, has long experience in developing telematics and driving-behaviour assessment products, having developed an innovative smartphone application that rates a driver's behaviour, in terms of safety and eco-driving, according to specific metrics (e.g., speeding, harsh braking/acceleration, distraction/mobile use).
The two kinds of risky behaviour the UC5 novel feature aims to detect are:

  • Aggressive driving: harsh braking, harsh acceleration, speeding, crashes
  • Distracted driving: mobile use.

UC5 will use two different sources of data: i) the driver's mobile phone, which will provide data from the smartphone sensors (e.g., gyroscope, compass, accelerometer), and ii) an OBU installed in the vehicle, which will provide GPS position data.
OSeven will develop an ML model, to be trained at the edge, for the assessment of road risk levels, combined with real-time notifications. This ML model will assign a risk level along roads based on data aggregated over a specified model-training period. In addition, UC5 will employ real-time hazardous-event detection to inform drivers of increases in the risk level of the road they are driving on.
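The aggregation step behind such a risk model can be sketched very simply: count the hazardous events reported per road segment over the training period, normalize by traffic, and map the rate to a discrete risk level. This is an illustrative sketch only; the thresholds, segment granularity and event-per-trip metric are assumptions, not OSeven's actual model.

```python
# Hypothetical sketch of the UC5 road risk-level aggregation: hazardous
# events (harsh braking/acceleration, speeding, mobile use) reported by
# many vehicles are counted per road segment and normalized by the number
# of trips, then mapped to a discrete risk level. Thresholds are assumed.
from collections import Counter

def risk_levels(events, trips_per_segment):
    """events: [(segment_id, event_type), ...] over the training period.
    trips_per_segment: {segment_id: number of trips observed}.
    Returns {segment_id: 'low' | 'medium' | 'high'}."""
    counts = Counter(seg for seg, _ in events)
    levels = {}
    for seg, trips in trips_per_segment.items():
        rate = counts[seg] / trips if trips else 0.0  # events per trip
        if rate >= 0.5:
            levels[seg] = "high"
        elif rate >= 0.1:
            levels[seg] = "medium"
        else:
            levels[seg] = "low"
    return levels
```

The real-time part of UC5 would then raise a segment's level on the fly when fresh hazardous events are detected, feeding the many-to-many notifications described below.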

The developed feature will inform drivers in advance about risky and distracted driving in the road network, improving driver information, increasing their awareness and decreasing their reaction time. It will provide this information through a many-to-many integrated approach, detecting hazardous events from many vehicles and notifying many vehicles in real time. In addition, it will assist drivers in low-visibility situations, such as fog and poor lighting conditions, by effectively enhancing their visibility.

The goal of UC6-NSTAT is to provide a network monitoring service that can be used to increase the efficiency of other NetApps by providing distributed predictions of QoS and, more generally, of network conditions at various locations.
This NetApp provides an overview of the status of network components or virtual network functions and draws conclusions and predictions about the performance of the monitored components. It utilizes network communications to deliver predictions of network quality to a central computation entity at the MEC server. The NetApp aims to minimize the data-collection effort through a distributed machine learning approach: instead of collecting large amounts of network monitoring data for central analysis, the ML analysis/prediction model is distributed across the VNFs located at the RSUs and the vehicle OBUs. The goals of the ML model are (1) to learn data traffic patterns for data traffic prediction, (2) to learn network condition models to provide QoS predictions, and (3) to learn to distinguish between normal and abnormal network behaviours to detect and predict faults.
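The core of this distributed approach resembles federated learning: each VNF trains on its local monitoring data and only model parameters travel to the central entity at the MEC, which combines them. A minimal sketch of the combining step (weighted federated averaging) is shown below; the flat parameter vectors and sample-count weighting are illustrative assumptions, not the project's specified method.

```python
# Minimal sketch of the distributed-learning idea behind UC6-NSTAT:
# VNFs at RSUs and OBUs train local models on their own monitoring data;
# only the model parameters are sent to the MEC, where they are averaged,
# weighted by how many samples each node trained on.

def federated_average(local_weights, sample_counts):
    """local_weights: one flat parameter list per node.
    sample_counts: training samples seen by each node.
    Returns the sample-weighted average parameter list."""
    total = sum(sample_counts)
    n_params = len(local_weights[0])
    return [
        sum(w[i] * n for w, n in zip(local_weights, sample_counts)) / total
        for i in range(n_params)
    ]

# Two RSU models and one OBU model, each a [bias, slope] pair:
global_model = federated_average(
    [[0.1, 2.0], [0.3, 2.2], [0.2, 1.8]],
    sample_counts=[100, 100, 200],
)
```

The appeal for network monitoring is exactly the point the paragraph makes: raw traffic traces never leave the RSUs and OBUs, so the data-collection effort and backhaul load are minimized.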
Use Case 7 (UC7) aims to develop and integrate the necessary components (e.g., VNFs, RSU, OBUs, sensors, camera, 5G SA network, and 5G MEC) to provide situational awareness for first responders intervening in a cross-border road tunnel. In case of an accident in the tunnel, situational awareness systems will enable first responders to understand the specific conditions of the tunnel incident, such as the number of involved vehicles, temperature, humidity, levels of certain gases, etc.
The RSU and the sensors connected to it, as well as the OBUs, will provide environmental data sensed in the tunnel, including a real-time video stream. VNFs deployed on the RSU and OBUs will take care of collecting and initially processing these data and will transmit them to the MEC, where monitoring, analytics and streaming VNFs will collect and further process them.
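The "initial processing" at the RSU/OBU VNFs could be as simple as keeping the latest reading per sensor and flagging hazardous values before forwarding a compact summary to the MEC. The sketch below illustrates that under stated assumptions; the sensor field names and alarm thresholds are made up for the example.

```python
# Illustrative sketch (field names and thresholds are assumptions) of the
# initial processing a VNF on the RSU could perform: keep the latest value
# per tunnel sensor and flag hazardous readings, so the MEC and the first
# responders see the tunnel conditions at a glance.

CO_ALARM_PPM = 200.0   # assumed carbon-monoxide alarm level
TEMP_ALARM_C = 60.0    # assumed temperature alarm level

def summarize_readings(samples):
    """samples: [{'sensor': str, 'value': float}, ...] in arrival order.
    Returns (latest value per sensor, list of alarm messages)."""
    latest, alarms = {}, []
    for s in samples:
        latest[s["sensor"]] = s["value"]  # later samples overwrite earlier
    if latest.get("co_ppm", 0.0) >= CO_ALARM_PPM:
        alarms.append("CO level critical")
    if latest.get("temp_c", 0.0) >= TEMP_ALARM_C:
        alarms.append("Temperature critical")
    return latest, alarms
```

Forwarding only such summaries (plus the video stream) keeps the uplink to the MEC light, while the analytics VNFs at the MEC can still reconstruct the incident picture.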
One of the most important benefits expected from using an orchestrated 5G network is cross-border collaboration of first responders, which in real life still faces many issues because each PPDR administrative domain operates its own communication system (often incompatible with the one across the border). Utilizing a 5G system in both/all PPDR administrative domains will help create applications that all PPDR users can consume. All components of the solution will be generic and therefore applicable to any 5G network that provides the required conditions (e.g., an eMBB network slice, latency requirements, etc.). To enable cross-border data sharing, the two networks will need to be connected in a proper way (e.g., via a VPN), and certain security rules will need to be established to prevent data privacy or similar issues while transferring data across the border.