Volvo Drive Me Project on Hold

Swedish car manufacturer Volvo has not received authorization for the Drive Me project in Gothenburg.
It was planned for years, but now faces an unclear future: Volvo’s Drive Me project intended to lend semi-automated XC90 models to families in Gothenburg, who would act as beta testers of the technology. Tests of this kind were also planned for London and China, but nothing will happen in Sweden for the time being.
The Drive Me project was expected to yield essential real-world insights by providing 100 families with autonomous vehicles, which would then be used, for example, to commute to work. Production of the pilot-assisted XC90 models began in 2016, with tests scheduled to begin in 2017.
The project was supported by a number of cooperation partners who are also involved in developing the technology. One of them is the supplier Autoliv, with whom Volvo founded an entirely new company, Zenuity, to commercialize autonomous driving.

Families as Test Objects?

The tests had not yet been approved, but Volvo still expected to receive permission, so the families had already been introduced to the technology. This was the officially stated reason for the delay, until the Swedish transport authority (Transportstyrelsen) formally refused permission for the tests. The reason: semi-autonomous technology is still too unsafe to put in the hands of families.
Swedish authorities cited legal difficulties and the question of liability as reasons for the ban on tests. It can be assumed that the fatal accident in Arizona also contributed to the provisional end of the tests. Investigations showed that the safety driver was watching a TV show on her phone when her autonomous Uber caused the accident. Uber sources many of its test cars from Volvo, and the car involved in the crash was one of them. Fear of further accidents in Gothenburg likely played a role in the authorities’ decision.

Reaching Level 4 with the Highway Pilot

Meanwhile, Volvo has announced that it wants to reach Level 4 (high automation) by 2021 – at least on highways. The company is now working on the Highway Pilot, the successor to Pilot Assist, which should allow drivers to devote themselves to other things while driving.

About the author:

David Fluhr is a journalist and the owner of the digital magazine “Autonomes Fahren & Co”. He reports regularly on trends and technologies in the fields of Autonomous Driving, HMI, Telematics and Robotics. Link to his site:

Human-Controlled Vs. No Control At All

Some automakers are toying with the idea of completely eliminating pedals and steering wheels. It sounds interesting on paper, but what if the car’s safety features fail and a passenger needs to intervene? Human intervention is considered a major danger to AVs, but would cars really be better off without any human control?

“I think that’s the question everyone has,” said Michael Schuldenfrei, a technology fellow at Optimal+, a big data analytics company. “Even an aircraft, which can do just about everything automatically – takeoff through landing – even then people still have manual overrides for everything. No one is pretending you can do it completely automatically.”

Maybe not, but Christophe Begue, director of solution strategy and business development at IBM’s Global Electronics Industry Team, said that you could argue it’s possible to remove pilots. Automobiles are a whole other story.

“Driving a car is more complicated than flying a plane,” said Begue.

Between traffic lights, intersections, varying road designs and fluctuating speed limits, it’s easy to see why. Planes are also dwarfed by the number of cars jamming the world’s biggest roads, which adds to the complex nature of driving.

“I think it’s going to take quite a bit of time unless it’s in a very controlled environment, like a controlled campus,” Begue added.

Controlled environments are already being tested. But while there are some shuttles that offer low-speed, pedal and wheel-free mobility in a geo-fenced environment, most automakers are not yet willing to drop human controls.

“I think what you see is everyone is being very aggressive about going to completely autonomous level 5 autonomy in the vehicle,” said Schuldenfrei. “And then as soon as something happens, they back away very fast. The Uber [incident] is a classic example.”

Schuldenfrei does not expect an override-free AV to arrive as quickly as the hype suggests. However, he is concerned that no amount of control will keep passengers safe if something goes wrong.

“Even if you have an override, when the car is driving itself, you’re already opening up tremendous risk that the driver won’t even be concentrating,” said Schuldenfrei. “So if something goes wrong, the driver [may not] notice.”

Schuldenfrei also thinks about the classic moral dilemmas faced by human drivers. If a human driver has to crash into a tree to avoid hitting a bunch of kids, he or she will likely do so without thinking twice. It’s a natural instinct. Autonomous cars would have to be programmed to do the same. No amount of machine learning will change that.

“So what do you do: do you drive into a tree and kill the driver or drive into the kids and kill the kids?” Schuldenfrei asked. “On the flipside, where is the liability and to what extent does it play a role? If you look at the statistics, autonomous cars are significantly safer per mile, per kilometer driven than human-driven cars. But it doesn’t mean they’re not going to make a mistake ever or that there won’t be a situation where you’ll be in an accident.”

Begue thought about all this for a moment. It takes a lot to develop an autonomous car, but the progress has been impressive, to say the least. New AV tests are cropping up all over the place.

“I live in San Francisco,” said Begue. “I cannot go out into the street and walk for more than 30 minutes without seeing one or two of these autonomous cars. Five years ago, would I have imagined that I’d see one every day? Probably not. Things are moving pretty fast.”

About the author:

Louis Bedigian is an experienced journalist and contributor to various automotive trade publications. He is a dynamic writer, editor and communications specialist with expertise in the areas of journalism, promotional copy, PR, research and social networking.

Startup PerceptIn developed an Autonomous Vehicle for less than $10,000

Until now, the race for autonomous driving has mainly been about vehicles that can drive on public roads. But there are also alternative approaches that target other locations.

According to most studies, Waymo (Google’s sister company within the Alphabet Group) and General Motors are leading the race to introduce autonomous driving. Both want to build robotaxi-based driving services with vehicles that can drive on all roads.

Competition for Speed

However, this still requires considerable development work. The faster a car drives, the better its technology must be – and this applies to autonomous driving in particular. Accidents at high speed can be fatal, as the recent incidents involving Tesla and Uber have shown. Fast-driving autonomous cars require efficient sensors that can “see” what is happening far ahead. This applies to all sensor types, whether radar, camera or lidar.

Additionally, the data collected by the sensors has to be processed promptly so that the right actions can be initiated at high speed in an emergency – not to mention the braking distance. This in turn demands high computing capacity.
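To make the relationship between speed, sensing and stopping concrete, here is a back-of-the-envelope sketch; the reaction time and deceleration values are illustrative assumptions, not figures from any manufacturer:

```python
# Rough sketch: the forward detection range a car needs is the distance
# covered while the system reacts plus the braking distance v^2 / (2a).
# All parameter values below are illustrative assumptions.

def required_sensor_range_m(speed_kmh: float,
                            reaction_time_s: float = 0.5,
                            deceleration_ms2: float = 6.0) -> float:
    """Minimum forward detection range needed to come to a stop in time."""
    v = speed_kmh / 3.6                      # km/h -> m/s
    reaction_distance = v * reaction_time_s  # travelled before braking starts
    braking_distance = v * v / (2 * deceleration_ms2)
    return reaction_distance + braking_distance

for speed in (30, 80, 130):
    print(f"{speed} km/h -> ~{required_sensor_range_m(speed):.0f} m")
```

Under these assumptions a car at 130 km/h must perceive reliably well beyond 100 meters, while at 30 km/h roughly 10 meters suffice – which is exactly the cost lever that slow, confined-area vehicles exploit.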

Why not go slower?

The problem is that powerful sensors and high computing capacity are very expensive, and this is an obstacle to the technology. That’s why the startup PerceptIn follows another path, one that significantly reduces the cost of an autonomous vehicle. Former Baidu employee Shaoshan Liu founded PerceptIn in 2016 with the goal of creating a reliable vehicle meant not for public roads but for confined areas – e.g. university campuses, company grounds or parks.

To this end, the company developed an electric car that can be produced in China for less than $5,000. The hardware and software for autonomous driving double the price. This is where speed comes into play again: the slower the vehicle, the less computing capacity is needed. The car also forgoes a lidar system in favor of inexpensive camera, radar and ultrasound technology.

The omission of lidar sensors is compensated for by camera technology that builds a 3D image from point clouds. This is certainly not suitable for high velocities, but is perfectly fine for vehicles that go no faster than 20 km/h. The vehicle’s position is determined to an accuracy of 20 centimeters by GPS and an odometry sensor. The medium-range radar sees only 50 meters, and the inexpensive ultrasonic sensor has a range of five meters.

Sales and Customers

PerceptIn has already found its first customer: ZTE. The Chinese telecommunications company purchased five units, which will be used on its own premises. Overall, PerceptIn hopes for sales figures in the six-figure range.

About the author:

David Fluhr is a journalist and the owner of the digital magazine “Autonomes Fahren & Co”. He reports regularly on trends and technologies in the fields of Autonomous Driving, HMI, Telematics and Robotics. Link to his site:

Introducing Tesla 9.0

Tesla introduced a new update – autonomous driving features included.
Recently Tesla began to falter, hit by economic problems, lawsuits and, not least, the fatal crash of a car under Autopilot control. Now Tesla has announced a new update to manage the turnaround.
Tesla’s Autopilot was one of the first such systems on the market, offering a Level 2 semi-autonomous driving solution. The cars were able to drive by themselves, but the manual required the human driver to keep their hands on the wheel and stay alert. The driver had to be able to take over at any time because the technology was not ready to master all traffic situations.

Autopiloted accidents

Lately there has been an increase in incidents involving Tesla’s Autopilot. The system seems to tempt drivers to lose their attentiveness in traffic and take up other activities. In Great Britain, a driver lost his license for sitting in the passenger seat while his car drove on Autopilot.
On March 23, 2018, a Tesla Model X crashed into a freeway divider in California, killing the driver and causing a fire. The US National Transportation Safety Board (NTSB) took over the investigation; its report stated that the driver was not paying attention. The car had been tracking the vehicle ahead and lost its point of reference at the divider. At that point the driver had to take over control of the vehicle – unfortunately, he didn’t. The car accelerated, pulled to the left and crashed into the divider.
There have been further incidents in which Tesla models caused collisions by accelerating for no apparent reason. One possible cause is a faulty update. Many users sued Tesla for selling life-threatening technology; these cases were settled out of court. However, the company is facing more lawsuits initiated by consumer protection organizations.

Update 9.0

Elon Musk announced on Twitter that a new update with additional autonomous driving features will roll out starting in August. What those features are, and what autonomy level Tesla models will reach with them, was not communicated. The software update does not add a lidar system. Lidar builds a 3D image of the vehicle’s environment and serves for orientation and positioning. Last year a GM manager claimed that Tesla cannot build autonomous cars because its vehicles lack lidar; Tesla instead relies primarily on camera sensors.
However, researchers at Audi and MIT have developed algorithms that calculate 3D information from camera images. Whether that is Tesla’s plan is unclear, but it cannot be ruled out. We can only hope that the August update delivers more safety: not only Tesla’s reputation is at stake, but also that of autonomous driving technology as such.

About the author:

David Fluhr is a journalist and the owner of the digital magazine “Autonomes Fahren & Co”. He reports regularly on trends and technologies in the fields of Autonomous Driving, HMI, Telematics and Robotics. Link to his site:

Waymo takes the Gloves off

Google’s sister company Waymo is considered the market leader for a technology that is not yet marketable. That is, Waymo is closest to commercializing autonomous driving, according to several research studies. The company plans to set up on-demand transportation services as a competitor to conventional cabs. Originally based in California, Waymo obtained authorization to take the project to Arizona.

Waymo’s vehicles

Initially Waymo wanted to build its own cars but eventually moved away from the idea. Today Waymo’s fleet consists of Fiat Chrysler (FCA) models, more precisely Chrysler Pacifica vehicles. Additional collaborations with Jaguar Land Rover and Honda follow the same model. It is likely that the Jaguar models will cover the luxury segment, FCA the mid-range segment and Honda the compact car division. Lately Waymo ordered 62,000 new Chrysler Pacifica vehicles, which are currently being equipped with autonomous driving technology. Waymo said nothing about acquisition and modification costs, but industry experts estimated the costs at about 31 billion USD – for modification purposes alone.

Waymo & Uber?

Meanwhile Uber has raised its hand and proposed a collaboration with Waymo – which seems utopian after the two companies settled a legal dispute in which Waymo had sought as much as 2.6 billion USD from Uber. The subject of the lawsuit: stolen trade secrets, in this case information about a lidar system.

Moreover, Uber bears responsibility for a fatal crash in Arizona, where an autonomous car ran over a woman after the emergency braking system had been deactivated. Uber has since aborted all testing activities in Arizona and won’t resume testing in the state. The incident cost the technology a lot of trust and led to stricter testing regulations in the USA.

When is the time?

It is questionable whether Waymo will be able to offer an autonomous transport service in 2018. The modification of more than 60,000 FCA vehicles is expected to take more than a year. However, Waymo should not take its foot off the gas: main competitor GM plans to mass-produce a highly autonomous (Level 4) vehicle in 2019.

About the author:

David Fluhr is a journalist and the owner of the digital magazine “Autonomes Fahren & Co”. He reports regularly on trends and technologies in the fields of Autonomous Driving, HMI, Telematics and Robotics. Link to his site:

Don’t enter Carwash Facilities with Autonomous Vehicles!

Countless experts and organizations are talking about autonomous driving, although we haven’t seen it in its final form yet. Today we are still stuck at Level 3 automation, but Level 4 seems within reach.
Virtually all players in the autonomous vehicle market are testing their cars on the streets or in simulators. The latter enable companies to simulate any traffic situation imaginable and evaluate the car’s behaviour. However, real traffic still holds scenarios that cannot be simulated, as seen in the fatal crash of an Uber in Arizona. In that case the misbehaviour was caused by a software bug, but sensors can also fail when they are dirty after covering a long distance without cleaning.

Pollution as a main Source of Error

Sensors have to be cleaned regularly in order to work properly. If they are covered with dirt, autonomous cars can barely “see”. One might think: “No problem, I’ll just let the automatic carwash do the job.” Unfortunately, autonomous vehicles are not allowed to enter carwash facilities, as Futurism found out.
The cleaning brushes could dislodge external sensors entirely and strip the car of its ability to locate itself, other objects and road users. In addition, soap or water residue on the car may interfere with the sensors’ functionality and lead to false interpretations of the environment. It is also a matter of cost – although lidar sensors are getting cheaper, they still cost five-figure sums. Imagine the cost if autonomous cars started getting their lidars swept off the roof.

Self-Wash your Self-Driving Car

Companies like Waymo and Uber have confirmed that they hire personnel to clean their fleets manually instead of using automatic carwashes. To protect the sensors, they are treated with microfiber cloths and special cleaning liquids. Of course, this is not a mass-market solution – just think of future traffic ruled by autonomous vehicles that all have to be cleaned by hand. The industry is already working on automatic cleaning units that start operating as soon as dirt or smear is detected on a sensor.

About the author:

David Fluhr is a journalist and the owner of the digital magazine “Autonomes Fahren & Co”. He reports regularly on trends and technologies in the fields of Autonomous Driving, HMI, Telematics and Robotics. Link to his site:

Interview with Prof. Dr. Daniel Cremers: Technical Challenges in Developing HAD?

During Automotive Tech.AD 2018 we had a chat with Prof. Dr. Daniel Cremers, who holds various roles at the Technical University of Munich. Prof. Cremers gives us a better understanding of his roundtable discussion around the question “What are the most serious technical challenges in developing HAD?”. He addresses the many requirements on the way to autonomous cars: cars need to understand what is happening around them, which requires reconstructing the car’s surroundings in order to carry out actions like path planning, decision making and obstacle avoidance. Moreover, he speaks about the role of deep neural networks in future self-driving cars.

About Prof. Dr. Daniel Cremers

Since 2009, Daniel Cremers has held the chair for Computer Vision and Pattern Recognition at the Technical University of Munich. His publications have received several awards, including the ‘Best Paper of the Year 2003’ (Int. Pattern Recognition Society), the ‘Olympus Award 2004’ (German Soc. for Pattern Recognition) and the ‘2005 UCLA Chancellor’s Award for Postdoctoral Research’. Professor Cremers has served as associate editor for several journals, including the International Journal of Computer Vision, the IEEE Transactions on Pattern Analysis and Machine Intelligence and the SIAM Journal of Imaging Sciences. He has served as area chair (associate editor) for ICCV, ECCV, CVPR, ACCV and IROS, and as program chair for ACCV 2014. He serves as general chair for the European Conference on Computer Vision 2018 in Munich. In December 2010 he was listed among “Germany’s top 40 researchers below 40” (Capital). On March 1st, 2016, Prof. Cremers received the Leibniz Award 2016, the biggest award in German academia. He is Managing Director of the TUM Department of Informatics. According to Google Scholar, Prof. Cremers has an h-index of 71 and his papers have been cited 19,173 times.

Mobileye and Intel test Autonomous Vehicles in Jerusalem

In March 2017, Intel bought the Israeli tech company Mobileye – not Intel’s first step towards autonomous driving, but definitely one of the most important. Mobileye provides computing capacity and sensors for the area around Jerusalem. The company also uses its expertise to advise several countries on the implementation of autonomous driving. Mobileye’s expertise has been indisputable at the latest since it sent 100 self-driving test vehicles onto the streets of Jerusalem.

The operation’s slogan: if you can do it in Jerusalem, you can do it anywhere – the city’s traffic is said to be extremely heavy and exhausting for human drivers. Apart from that, Jerusalem is also Mobileye’s home base.

Camera Sensors & True Redundancy

For the moment the testing fleet is equipped with camera sensors only. Eight cameras capture images to detect obstacles and traffic signs and to handle positioning and mapping. This allows the vehicle to work out optimal routes by itself. The approach of using camera sensors only is called “true redundancy”. Its advantage over using several kinds of sensors (“real redundancy”) is the small amount of data to be processed.


Data is processed by AI and converted into corresponding actions. To prevent the AI from commanding dangerous maneuvers, Mobileye developed Responsibility-Sensitive Safety (RSS) – a mathematical model that checks the AI’s orders against internal safety protocols. If an action or maneuver is not covered by the safety protocols, RSS prevents its execution. Intel has published the standards behind these protocols.
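The filtering idea described above – vetting AI-proposed maneuvers against safety rules before execution – can be sketched as a simple gate. The rule and threshold below are invented for illustration; the real RSS is a formal mathematical model published by Intel and Mobileye:

```python
# Toy sketch of an RSS-style safety gate: a planned action is executed
# only if it passes hard-coded safety rules; otherwise a conservative
# fallback is substituted. Rules and thresholds here are illustrative
# assumptions, not Mobileye's actual protocol.

SAFE_MIN_GAP_S = 2.0  # assumed minimum time gap to the car ahead, in seconds

def rss_filter(action: dict, time_gap_s: float) -> dict:
    """Return the action if it passes the safety rules, else a safe fallback."""
    if action["type"] == "accelerate" and time_gap_s < SAFE_MIN_GAP_S:
        # Accelerating while too close is not in the safety protocol:
        # replace it with a conservative fallback action.
        return {"type": "hold_speed"}
    return action

print(rss_filter({"type": "accelerate"}, time_gap_s=1.2))  # -> {'type': 'hold_speed'}
print(rss_filter({"type": "accelerate"}, time_gap_s=3.0))  # -> {'type': 'accelerate'}
```

The point of such a gate is that it sits outside the learned planner: even if the AI proposes something unsafe, the rule layer vetoes it deterministically.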

Computing Power

Today, all test vehicles are equipped with the EyeQ4 chip. However, Mobileye has already unveiled its successor, the EyeQ5, with ten times the computing power of the current chip. The EyeQ5 will be in full mass production by 2020 and has already been ordered by BMW for 2021.

First Troubleshooting

Shortly after the test fleet was sent out, the first issues emerged. One car ran a red light despite the efforts of a safety driver. Mobileye has at least identified and fixed the cause of the malfunction: a TV camera interfered with the transponder signal of the traffic lights. Because of the missing signal, the car crossed the junction as if there were no traffic lights.

About the author:

David Fluhr is a journalist and the owner of the digital magazine “Autonomes Fahren & Co”. He reports regularly on trends and technologies in the fields of Autonomous Driving, HMI, Telematics and Robotics. Link to his site:

Data Sharing with Insurance Companies: Curse or Blessing?

It is more blessed to share data than to gather it – UK startup Oxbotica wants to prove that claim with its unique approach to improving autonomous vehicles.
Oxbotica was founded in the UK as a spin-out from the Mobile Robotics Group at Oxford University’s Department of Engineering Science. Its focus is on sharing data collected by autonomous cars with authorities and insurance companies. What may sound horrifying to data privacy activists is a well-thought-out approach to improving traffic safety. Oxbotica expects data transparency to build commitment from the authorities, which should help accelerate the development of autonomous driving technology.

Testing process and objectives

The startup runs a fleet of three autonomous Ford Fusion models for its testing activities. The data is transferred via mobile networks and can be accessed by, among others, the insurance company XL Catlin. This creates terabytes of data – daily.
Oxbotica decides which kinds of data are forwarded to the authorities. The car delivers data about its current position and speed, but also about route complexity. The data is evaluated by Oxbotica’s software under one guiding question: which behaviors by the car increase safety – and which don’t? By analyzing the data, Oxbotica can identify dangerous maneuvers and prevent the cars from executing them.
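The forwarding step described above – sharing only selected fields such as position, speed and route complexity – might look like this in principle. The field names are hypothetical, not Oxbotica’s actual schema:

```python
# Sketch of a telemetry whitelist: only approved fields leave the car
# for authorities and insurers; everything else stays on board.
# Field names are invented for illustration.

SHARED_FIELDS = {"position", "speed_kmh", "route_complexity"}

def filter_record(record: dict) -> dict:
    """Keep only the whitelisted telemetry fields before forwarding."""
    return {k: v for k, v in record.items() if k in SHARED_FIELDS}

raw = {"position": (51.75, -1.26), "speed_kmh": 42,
       "route_complexity": 0.7, "camera_frame_id": "f-0012"}
print(filter_record(raw))  # camera_frame_id is withheld
```

A whitelist (rather than a blacklist) is the safer default here: a new on-board data field is private until someone explicitly decides to share it.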

Volvo is looking for the Critical Mass

The idea of connecting vehicles with each other is also being implemented by Volvo in Scandinavia. Connected cars inform each other about potential road hazards or dense traffic. If, for example, a car switches on its hazard lights, this is communicated to surrounding vehicles. The more cars participate in the conversation, the safer traffic gets. However, a certain number of cars is needed to provide a meaningful degree of safety. Volvo is testing in Sweden and Norway to find that critical mass and has invited more partners to join its program.
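At its core, this hazard sharing is a publish/subscribe pattern: each car publishes events such as “hazard lights on”, and nearby cars query for events within some radius. The sketch below is illustrative only – all class and field names are invented, and real V2X systems use standardized message formats:

```python
# Minimal publish/subscribe sketch of connected-car hazard sharing.
# Positions are simplified to a 1-D road coordinate in kilometers.

from dataclasses import dataclass

@dataclass
class HazardEvent:
    car_id: str
    position_km: float   # simplified 1-D position along the road
    kind: str            # e.g. "hazard_lights", "slippery_road"

class HazardChannel:
    def __init__(self):
        self.events = []

    def publish(self, event: HazardEvent):
        self.events.append(event)

    def nearby(self, position_km: float, radius_km: float = 2.0):
        """Events within radius_km of the querying car's position."""
        return [e for e in self.events
                if abs(e.position_km - position_km) <= radius_km]

channel = HazardChannel()
channel.publish(HazardEvent("SE-123", 14.2, "hazard_lights"))
print([e.kind for e in channel.nearby(position_km=15.0)])  # -> ['hazard_lights']
```

The “critical mass” problem is visible even in this toy: the channel is only useful if enough cars publish into it.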

5G let off the leash

The huge mass of data collected on the roads is a tough challenge for the existing network infrastructure. That’s why industry and politics are counting on 5G as the communication standard. The successor to 4G LTE will allow far more data to be transmitted and is expected to be more stable, with (almost) real-time data transmission.
Recently, Germany set up its first 5G testing areas in Berlin and Hamburg. In the USA, telecommunications giants Verizon, AT&T and T-Mobile plan to commercialize the technology as soon as possible. The first practical applications will show whether 5G can be the door opener for infotainment and autonomous driving.

About the author:

David Fluhr is a journalist and the owner of the digital magazine “Autonomes Fahren & Co”. He reports regularly on trends and technologies in the fields of Autonomous Driving, HMI, Telematics and Robotics. Link to his site:

Ad Punctum Case Study: Autonomous Vehicles will be slower than you think

How does sensor or AI performance affect the potential top speed of an autonomous car? And what is the current maximum speed of an autonomous car once sensor reliability is taken into account? Ad Punctum conducted a case study to carve out answers to these questions and draw conclusions about future mobility.

Case Study Conclusions:

  • Autonomous vehicle top speed is a function of sensor / AI reliability.
  • Based on current sensor performance, autonomous vehicles are unlikely to travel above 70 mph for passenger cars and 60 mph for large trucks.
  • Lower performance enables different engineering trade-offs (cheaper & lighter elements).
  • Vehicles will need to package-protect for step changes in sensor performance.
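The first conclusion – top speed as a function of sensor/AI reliability – follows from simple kinematics: a car may not outdrive the range at which its sensors are reliable. Solving range = reaction distance + braking distance for speed gives a rough ceiling. The numbers below are illustrative assumptions, not values from the study:

```python
import math

# Rough kinematic ceiling on speed: given a maximum reliable detection
# range r, the top speed v solves r = v*t + v^2 / (2a), a quadratic in v.
# Reaction time and deceleration values are illustrative assumptions.

def max_safe_speed_kmh(sensor_range_m: float,
                       reaction_time_s: float = 0.5,
                       deceleration_ms2: float = 6.0) -> float:
    """Top speed at which the car can still stop within its sensor range."""
    a, t, r = deceleration_ms2, reaction_time_s, sensor_range_m
    # v^2/(2a) + v*t - r = 0  ->  v = a * (-t + sqrt(t^2 + 2r/a))
    v = a * (-t + math.sqrt(t * t + 2 * r / a))  # m/s
    return v * 3.6  # km/h

print(f"~{max_safe_speed_kmh(120):.0f} km/h at a 120 m reliable detection range")
```

Under these assumptions a 120 m reliable range caps the car at roughly 126 km/h; any drop in sensor reliability shrinks the usable range and, with it, the speed ceiling – the mechanism behind the 70 mph / 60 mph figures above.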

Read the full Case Study and find out why it won’t be as easy as one might think to build fast autonomous vehicles. You can download the whole Case Study here.