
Segway-Ninebot: why now is the right time for computer vision and AI

Segway-Ninebot’s Vice-President of Robotics and Business Development Tony Ho speaks to Zag about the Paris referendum, the dawn of computer vision and AI and why timing is everything.


Earlier this month, shared e-scooter schemes in Paris suffered a major setback when the city chose not to renew the operators' licences.

For Tony Ho – Segway-Ninebot’s Vice-President of Robotics and Business Development – the crux of the issue is “how to encourage better parking of shared e-scooters, how to help prevent residents from riding on pavements, and a vocal minority being fed up with clutter on the streets.”

Ho and global micromobility provider Segway-Ninebot believe the solution to this is Segway Pilot. 

This is an intelligent device that can autonomously sense and detect diverse elements in an urban setting. Applying deep learning and computer vision, it comes equipped with a high-performance AI compute engine from Qualcomm and a wide field-of-view fisheye camera.

Ho, a trained engineer, is something of an expert in this field. “My specialty back in the mid 1990s was in computer vision for robotics applications,” he says. Now he focuses mostly on the micromobility sector with a continuous “itch” to develop new technology.

Zag Daily: What are you doing to help avoid another French exit? 

TH: “What I can tell you is that everybody is now actively looking into this whole computer vision and AI driven technology as a solution. In fact, there’s a little bit of rivalry among fleets because whoever gets that technology first has a higher chance of winning tenders. From our point of view we’re constantly thinking about what we can provide to fleets and to cities that can reduce the amount of complaints, and we feel our Segway Pilot that sits on our S90L e-scooter is the answer. It can detect pavements, pedestrians, and parking spaces all in real-time to help fleet managers monitor how e-scooters are being ridden and to give them more control.”

Zag Daily: How did your idea for computer vision and AI on an e-scooter come about? 

TH: “Back in 2018 I gave a talk at Micromobility Europe introducing the world’s first three-wheeled robotic e-scooter. Our aim was to solve the sidewalk clutter problem for good. It came equipped with repositioning software that allows remote operators thousands of miles away to move vehicles off the sidewalk and into a proper parking spot. The scooter was well received to begin with – Spin, now part of Tier, actually ordered this e-scooter to deploy in the US. This is an important product for us; however, it needed further exploration, as the robotic feature was not mature enough at the time.

“But we are always listening to what the market wants, and what’s interesting is that elements of the product have been very useful today. So it doesn’t drive on its own now, but the brain and the eyes of the robot have stayed the same. The new S90L is essentially the Segway Pilot module and everything put on top of our mature Max series model. And given what has happened in Paris, now there is a definite demand for this.”

Zag Daily: What sort of demand are you seeing for the S90L? 

TH: “Operators are definitely watching their wallets right now as they work towards becoming profitable businesses, so they’re generally more considerate when buying e-scooters. But the demand we see for the S90L is bucking this trend. So far we have received orders from some of the main operators from Europe, North America and APAC.”

Zag Daily: Are you currently working on any new Segway Pilot features? 

TH: “Yes, the Segway Pilot platform will be made available to the broader developer community. An open SDK AI Box will give industry participants and clients independent development capabilities and access, enabling unique operational and business models.”

Zag Daily: Do you feel your Segway Pilot is now ready to serve most towns’ and cities’ needs? 

TH: “Cities often have different requirements so we want to be able to provide fleet management companies with the ability to pick and choose what they need for their city. There are also different ways to implement this technology – some of the decisions get made on the vehicle side, while other decisions get made in the cloud. The short answer is yes, there’s a lot of capability in there, and in order to expand our reach to more cities, it’s crucial to work together with operators and partners who are aligned with our vision of leveraging AI technology.” 

Zag Daily: Had the Paris operators implemented your AI technology, do you feel confident enough to say that the vote would not have happened? 

TH: “Timing is everything. The reason I’m saying this is because it takes time to implement this technology. Even to this date, we’re still trying to figure out the optimal settings and rules. For instance, if we detect someone riding on the pavement, do you shut them off or slow them down? So in our case, we’re providing a tool to our customer, but it takes quite a bit of collaboration to tweak things to work optimally. If all this had been implemented two years ago with our tech, I don’t think the Paris vote would have happened, no. But I can’t stress enough that this whole Paris disaster was totally avoidable. It’s like Moore’s law. You can bet technology will get there, it is only a matter of time.”
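The "shut them off or slow them down" question Ho raises is essentially a per-city policy choice layered on top of the vehicle's detections. As a rough illustration only, the decision could be sketched as below; all names here (`PilotEvent`, `apply_policy`, the speed values) are hypothetical assumptions for this sketch, not part of any Segway-Ninebot API.

```python
# Hypothetical sketch: mapping a pavement-riding detection to a rider
# intervention. Each city/operator tunes the policy ("stop" vs "slow")
# and the speed cap - the tuning question described in the interview.
from dataclasses import dataclass

@dataclass
class PilotEvent:
    riding_on_pavement: bool  # the on-board AI classified the surface as pavement
    speed_kmh: float          # current vehicle speed

def apply_policy(event: PilotEvent, mode: str = "slow", cap_kmh: float = 8.0) -> float:
    """Return the speed limit to enforce on the vehicle for this event."""
    if not event.riding_on_pavement:
        return event.speed_kmh            # no intervention needed
    if mode == "stop":
        return 0.0                        # hard shut-off
    return min(event.speed_kmh, cap_kmh)  # throttle to roughly walking pace

print(apply_policy(PilotEvent(True, 20.0), mode="slow"))  # 8.0
print(apply_policy(PilotEvent(True, 20.0), mode="stop"))  # 0.0
```

In practice the same trade-off exists for where the decision is made: on the vehicle (low latency) or in the cloud (easier to retune), which echoes the on-vehicle vs in-cloud split Ho describes.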