Future of AI

The future of autonomous mobility is closer—and more accessible—than you think

Over the past decade, people became obsessed with, skeptical of, and then disappointed by the delayed reality of autonomous vehicles—and the misguided narrative that much of the media promoted didn't help.

But away from the public eye, experts and industry professionals knew that autonomous mobility was not the immediate future. Ultimately, the can was kicked down the road, and a new narrative set in: The future of autonomy—while not entirely out of reach—was still years (if not decades) away.

In 2017, I was taking an online course in machine learning and entered a contest to develop software for self-driving cars. The challenge consisted of leveraging sensor data to identify objects on the road and steer cars—but, at the time, there was only so much research on how to apply deep learning to LiDAR data. People were mostly using a combination of cameras and radar—never LiDAR alone.

My team, all experts in various areas of 3D processing, decided to go all in on LiDAR for the contest. We ended up coming in tenth out of more than 2,000 teams, and that challenge propelled us to found Seoul Robotics.

Over time, working with clients that now include six OEMs and three Tier 1 suppliers, we came to a bigger realization that affects society on a much larger scale: The future of autonomous mobility and smart cities isn't as far off as we had thought. In fact, it's not only closer than we realize, it's also safer and more accessible. And it depends on 3D data and infrastructure.

Making autonomous mobility more accessible 

Sensors are evolving, but even as technology advances and prices fall, LiDAR sensors remain inaccessible to most: They're expensive, they require sophisticated software that takes millions of dollars to train and refine, and an AV needs dozens of sensors to work effectively. Reducing the number of on-vehicle sensors, on the other hand, has proven to be outright dangerous. Neither approach is sustainable—but thankfully, sensors on the vehicle are not the only route.

While many everyday Americans are familiar with the idea of autonomous vehicles, few are familiar with the idea that cars can be controlled by the infrastructure around them. Instead of relying on on-vehicle sensors to drive, AVs—cars, bikes, scooters, buses—could use the connected 5G systems that now come standard in many models to communicate with the infrastructure that surrounds them, like stoplights, road signs, road barriers, and more. Making this existing infrastructure smarter lets us develop and deploy autonomous mobility solutions through a connected ecosystem instead of relying solely on on-vehicle sensors.
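
To make the idea concrete, here is a minimal sketch of how a roadside perception node could share what it sees with nearby connected vehicles. It is purely illustrative: the PerceivedObject fields, the build_roadside_message function, and the JSON layout are assumptions made for this example, not Seoul Robotics software or any V2X message standard.

    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class PerceivedObject:
        """One object detected by an infrastructure-mounted sensor."""
        object_id: int
        category: str       # e.g. "car", "pedestrian", "bicycle"
        x_m: float          # position in a shared map frame, in meters
        y_m: float
        speed_mps: float
        heading_deg: float

    def build_roadside_message(node_id: str, objects: list) -> str:
        """Package an infrastructure node's current view as a JSON message
        that could be broadcast to nearby vehicles over a 5G/V2X link."""
        payload = {
            "node_id": node_id,
            "timestamp": time.time(),
            "objects": [asdict(obj) for obj in objects],
        }
        return json.dumps(payload)

    # A stoplight-mounted node tells approaching vehicles about a pedestrian
    # it can see but that an on-vehicle sensor might miss (e.g. around a corner).
    message = build_roadside_message(
        "intersection-42",
        [PerceivedObject(1, "pedestrian", x_m=12.3, y_m=-4.1,
                         speed_mps=1.4, heading_deg=90.0)],
    )
    print(message)

In a real deployment the message would travel over a standardized vehicle-to-infrastructure channel rather than a print statement, but the division of labor is the same: the infrastructure perceives, and the vehicle acts on what it receives.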

Autonomous vehicles were once pigeonholed as an expensive, exclusive solution. But approaching autonomy through infrastructure instead ultimately makes it safer, less expensive overall, and more accessible to the everyday person—and brings us that much closer to an autonomous future.

Why is it less expensive? Because the number of sensors and computers scales with the coverage area, not with the number of vehicles: you could automate thousands of vehicles with only tens of sensors. The next section shows how BMW is using our solution to significantly reduce the cost of deploying autonomous vehicles.
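
As a rough, back-of-the-envelope illustration of that scaling, the sketch below compares the two approaches. Every figure and function name here is an invented assumption for the sake of the example, not real pricing or deployment data.

    # Illustrative cost comparison; all numbers are assumptions, not real prices.
    SENSOR_UNIT_COST = 1_000   # cost of one sensor (assumed)
    SENSORS_PER_AV = 30        # on-vehicle sensors per fully equipped AV (assumed)
    SENSORS_PER_SITE = 40      # infrastructure sensors covering one site (assumed)

    def on_vehicle_cost(fleet_size: int) -> int:
        """Sensor cost when every vehicle carries its own sensor suite:
        it grows linearly with the size of the fleet."""
        return fleet_size * SENSORS_PER_AV * SENSOR_UNIT_COST

    def infrastructure_cost(fleet_size: int) -> int:
        """Sensor cost when the sensors live in the infrastructure:
        it depends on the coverage area, not on the number of vehicles."""
        return SENSORS_PER_SITE * SENSOR_UNIT_COST

    for fleet in (10, 100, 1_000):
        print(fleet, on_vehicle_cost(fleet), infrastructure_cost(fleet))

The specific numbers do not matter; the shape of the comparison does. One cost grows with every vehicle added to the fleet, while the other stays flat once the area is covered.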

Autonomy through infrastructure today—and tomorrow

Autonomy through infrastructure isn't a far-fetched idea. In fact, it's already in play. BMW, for example, is using this type of technology for an automated logistics solution at its factory lots: infrastructure-mounted LiDAR drives the vehicles through connectivity. Thousands of vehicles can be automated with just a handful of LiDAR sensors—and although factory logistics might feel removed from our daily lives, this deployment is proof of the potential societal impact of autonomy through infrastructure when applied on a broader scale.

By design, the system is also safer: it watches each vehicle from at least four different vantage points, eliminating blind spots and even helping vehicles see things far beyond their own field of view.

It could be deployed in large parking lots at designated hotels, airports, and shopping malls, where valet parking could become autonomous. No more wandering around trying to figure out where you parked!

It could revolutionize the trucking industry: as a truck enters a logistics hub enabled for autonomy through infrastructure, parking, loading, and fueling can all be done autonomously, making the industry far more efficient.

These advancements could power orderly, efficient and safe autonomous driving in notoriously frustrating scenarios, like traffic-heavy parking lots after concerts or championship games. Zoom out, and it could help improve safety on crowded highways or at busy, dangerous intersections. 

And beyond on-road AVs, this application is a clear opportunity to make cities smarter and improve urban mobility by streamlining public transportation—like shuttles and buses—or automating parking in a cost-effective, scalable way.

That Jetsons-like future that was initially sold to the public might still be somewhat out of reach. But by approaching autonomy through infrastructure, we—companies, cities, government officials, transportation departments—are currently knocking at the door of a safer and more accessible level of autonomous mobility, one intersection at a time.

Author

  • HanBin Lee

    HanBin Lee is the CEO and Co-Founder of Seoul Robotics, where he oversees the company's work developing vision software for 3D sensors, including LiDAR. HanBin started Seoul Robotics from a small back alley in Seoul, South Korea in 2017, and since then has brought the company to the world stage, cultivating partnerships with many global customers, including BMW, Mercedes-Benz, Volvo, and Qualcomm.
