Episodes

  • Till a few weeks back, Cruise was considered one of the big three of general autonomous driving. It was licensed to run a robotaxi service in San Francisco, and my LinkedIn feed was full of folks gushing over the magical experience of being driven around in a car without a driver.

    Then the proverbial shit hit the fan. One of Cruise’s robotaxis got caught in a classic edge case: a road user, hit by another vehicle, fell into its path. So far, so bad, but then things got worse.

    In the last few weeks, heads have rolled. Cruise has seen the departure of its CEO and other key execs. The company, owned by GM, has decided to get rid of a quarter of its staff, and finds itself in a proper existential crisis.

    How did things come to this, and could they have been avoided?

    To find out, I invited Alex Roy to the AI in Automotive Podcast. Alex is one of the most recognised voices, and an absolute authority in this space. He wears many hats, amongst which is hosting the very popular Autonocast podcast. Previously, Alex worked as an exec at Argo, and was key to their thoughtful approach to operationalising self-driving cars on public roads.

    While my conversation with Alex started with Cruise, the theme is not about Cruise alone. There is a long tail of edge cases, and things are going to go wrong as this very nascent technology is brought to market. This is also a very new space, and as one might expect, regulation needs to find the right balance between encouraging innovation and guaranteeing safety. The technical scale of the problem should not be underestimated, and it rarely is. But it is the human side of the problem that often does not get the attention it deserves. My chat with Alex underlined for me that getting the human and cultural piece right is going to be as critical to the success of autonomous driving as solving the technical problem.

    With this, season 4 of the AI in Automotive Podcast is a wrap. I hope you enjoyed listening to my chat with Alex on the season’s final episode. Please do share the episode with your friends or colleagues, or drop a note on your socials - I always appreciate your support.

    #ai #automotive #mobility #technology #podcast #selfdriving #autonomousdriving #safety #leadership #cruise

    https://www.ai-in-automotive.com/aiia/406/alexroy

    AI in Automotive Podcast

  • Cities have been at the heart of civilisation for millennia. Today, the World Bank estimates that cities contribute 80% of global GDP. They are central to our growth and prosperity, yet every major city in the world faces challenges ranging from poor air quality to creaking infrastructure.

    So how do cities evolve to prepare for the future? And what role does AI play in this evolution?

    On this unique episode of the AI in Automotive Podcast, I invited the CEOs of two companies that are enabling our cities to become safer, smarter and more sustainable using the power of artificial intelligence.

    Andrew Fleury is the CEO of Luna Systems, a company that is making mobility smarter using their computer vision capabilities. They are putting cameras on micromobility scooters, and using AI to help micromobility operators give their riders a safer experience.

    Chris Tingley runs EVWare, a company whose hardware and software platform makes vehicles safe, connected and intelligent. They do this by bringing high-tech features and functionality to vehicles of all shapes and sizes, including micromobility scooters.

    The modern city generates bucketloads of data, and has done for a while now. Till a few years back, though, there was limited use for this data. Perhaps its quality was suboptimal. Perhaps it was in a form that made it hard to identify patterns and generate insights. Maybe we did not have the tools and infrastructure to leverage it.

    All that is changing fast. With the rise of AI and the commoditisation of cloud infrastructure, the data that cities generate carries immense potential to improve decision making and cut decision times by orders of magnitude. Companies like Luna Systems and EVWare are - each in their own way - creating a collaborative ecosystem of partners that can make our cities smarter, safer and more sustainable.

    I hope you enjoy listening to my chat with Andrew and Chris. If you do, go ahead and rate the AI in Automotive Podcast wherever you get your podcasts.

    #ai #automotive #mobility #technology #podcast #machinelearning #urbandesign #cities #infrastructure #vision #micromobility

    AI in Automotive Podcast

  • Vehicle quality issues that lead to recalls and lawsuits cost automotive OEMs tens of billions of dollars in direct costs and lost revenue each year. Given the explosion of connected vehicle data, one might expect that this data could be leveraged to reduce that bill. Things are rarely that straightforward. Why is that?

    I invited David Hallac, CEO of Viaduct, to the AI in Automotive Podcast to find out more. David’s 5-year-old startup finds patterns and relationships amongst billions of connected vehicle data points, and delivers two powerful, commercially sound use cases to automotive OEMs. One, it helps OEMs proactively identify and address quality issues, saving hundreds of millions of dollars in warranty costs and recalls. Two, it predicts failures, calls vehicles in for proactive maintenance, and improves uptime - a godsend, especially for fleet customers.

    The big penny drop moment for me during my conversation with David was that connected vehicle applications don’t have to be bold, visible and sexy, delivering massive incremental revenue at near 100% margin. In fact, the connected vehicle applications most likely to succeed in the near-term are those that deliver commercial value today, often by way of substantially reduced costs. Viaduct’s quality management and maintenance prediction use cases check those boxes, and how. Listen to my chat with David to find out more.

    If you enjoyed my chit-chat with David Hallac, please give the AI in Automotive Podcast a solid five stars on Apple Podcasts and Spotify - I am always thankful for your support.

    #ai #automotive #mobility #technology #podcast #machinelearning #unsupervisedlearning #warranty #recalls #maintenance #quality

    AI in Automotive Podcast

  • Autonomous Driving is a big enough paradigm shift. But after years of research and billions of dollars spent trying to get cars to drive themselves, perhaps it is time for a paradigm shift within a paradigm shift. What might this look like?

    Daniel Langkilde, CEO of Kognic joins me on the AI in Automotive Podcast to discuss exactly this. Daniel and I talk about the current approach to autonomy, which involves breaking down a very complex problem into its components - perception, prediction and planning - and its limitations. Based on a better understanding of how humans actually go about accomplishing the task of driving, we ask if perhaps it is time to take a different approach to delivering autonomy at scale. We discuss a key component of this approach - the world model - or the ‘common sense’ that a machine must be equipped with to make sense of the complex world around it. Daniel also talks about alignment, what it means to steer a system towards accomplishing its stated goal, and its relevance to autonomous driving.

    I am convinced that we are far from done with solving autonomy. On the contrary, I feel there is a lot of unexplored territory yet, which can dramatically change how we approach this opportunity. I hope my chat with Daniel gave you a sneak peek into what the inception of paradigm shifts looks like, and what it means for the future of autonomous driving. If you enjoyed listening to this episode of the AI in Automotive Podcast, do share it with a friend or colleague, and rate it wherever you get your podcasts.

    AI in Automotive Podcast

  • Radars have been evolving at a really rapid clip, helped in no small part by innovative companies like Arbe Robotics. On today’s episode of the AI in Automotive Podcast, I am talking to Ben Rathaus, VP of AI and Perception at Arbe.

    Ben talks us through the history of radars, and how and why they found their way onto cars. We discuss how Arbe’s silicon and software is creating an order of magnitude improvement in the resolution and performance of automotive grade radars. We talk about the composition of radars, and their output - a mapping of the free space around the vehicle - an absolutely key building block of AD and ADAS algorithms.

    Ben and I started at cosmology and ended up at what the humble radar might look like in the future! Just another fascinating conversation that allowed me to understand the past and future of radars a lot better, as well as the very important role they play in making our cars smarter and safer. I think of them as the invisible, unsung heroes - working away diligently in the background, making everything around them work a lot better.

    If you have ever wondered whether future radars can wholly replace cameras on the car… well, you will find out at the end of my chat with Ben. So go have a listen, and if you like what you hear, do share the AI in Automotive Podcast with a friend or colleague.

    #ai #automotive #mobility #technology #podcast #radar #sensors #sensorfusion

    AI in Automotive Podcast

  • Connected cars have been around for a while, but in-car services have strangely not really taken off. I learnt in my chat with Todd Thomas, Chief Revenue Officer at AiDEN Auto, that there’s a good reason. Or four. Todd spoke to me about how the modern car will evolve from its current state into a more mature connected device, and in the process we unearthed some real gems of insight.

    After today’s chat with Todd, I have a much better understanding of what connected vehicles 2.0 might look like, and why we may just be at the cusp of an explosion in true in-car services, many of them powered by AI. I am also convinced that the way we interact with our cars is going to undergo a significant shift, substantially changing our relationship with our cars. It’s going to be an exciting next few years in this space. Subscribe to the AI in Automotive Podcast to stay in touch with the technologies shaping our industry’s future.

    AI in Automotive Podcast

  • Designing a car is hard. 🚙 🏎️ 🏁

    It involves solving a series of multi-dimensional problems under a variety of constraints. I am oversimplifying here, but a problem with 8 dimensions and 5 possible values for each dimension has 5^8 = 390,625 combinations to experiment with. Real-world problems are usually far more complex, and testing every unique combination of inputs is often not viable. So what is the solution, and what role does AI play?
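
    To make the arithmetic concrete, here is a minimal sketch of why exhaustive testing does not scale (the dimension and value counts are the illustrative figures above; the calibration framing is hypothetical, not Secondmind’s):

    ```python
    # Illustrative only: counting experiment combinations for an exhaustive grid search.
    dimensions = 8            # e.g. eight hypothetical calibration parameters
    values_per_dimension = 5  # five candidate settings for each parameter

    # Exhaustive testing grows exponentially with the number of dimensions.
    total_combinations = values_per_dimension ** dimensions
    print(total_combinations)  # 390625

    # Add just two more dimensions and the experiment count explodes 25x.
    print(values_per_dimension ** (dimensions + 2))  # 9765625
    ```

    Model-guided design of experiments aims to reach a good solution after sampling only a small fraction of that space.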

    Gary Brotman, CEO of Secondmind, joined me on the AI in Automotive Podcast to share the answer to that question, and talk to us about the role of AI in accelerating the automotive design process. Think of Secondmind’s platform as a co-pilot for automotive engineers. It uses the power of AI to help them design the right experiments to solve complex multi-dimensional problems of the sort they encounter in their day jobs. This focuses their limited resources on a subset of the problem space, and gets them to the optimal solution in the most time-efficient and resource-efficient manner, accelerating the design process.

    This is a fascinating conversation with one of the most experienced and knowledgeable blokes out there when it comes to AI in Automotive. I hope you enjoy my chat with Gary as much as I did. And if you do, go ahead and rate the AI in Automotive Podcast wherever you listen to podcasts.

    #ai #automotive #mobility #technology #podcast #engineering #designofexperiments #machinelearning

    https://www.ai-in-automotive.com/aiia/305/garybrotman

    AI in Automotive Podcast

  • Have you ever thought about how autonomy came to be? What were the origins of the idea of autonomous vehicles? Have you ever wondered what is next for autonomy, and how will this space evolve in the future?

    Well, wonder no more. On this episode of the AI in Automotive Podcast, I am delighted to host an OG member of the autonomy gang, whose connection with autonomous driving goes way back to the heady days of the DARPA Grand Challenge almost two decades ago.

    Bibhrajit Halder, Founder & CEO of SafeAI joins me to share how the idea of autonomy originated, how it evolved from 1.0 to 2.0, and how it might evolve in the future to 3.0 and 4.0. We also talk about the application of autonomous driving in the mining and construction industries, and how SafeAI is using AI to seriously disrupt the mining industry.

    I never thought I would be discussing mine economics on this show, but we did, and it was so much fun! I hope you enjoy my chat with Bibhrajit, and if you do, why don’t you go ahead and share the AI in Automotive Podcast with a friend or colleague who carries an interest in this space.

    AI in Automotive Podcast

  • In the world of autonomous driving, high-compute GPUs are all the rage. So I was incredibly delighted to learn of a company that is taking a very counter-intuitive approach to the perception stack. These guys have identified a number of use cases that do not require the 100% accuracy that autonomous driving demands, and are focused on making their vision perception stack work on smartphones you can buy for a hundred dollars.

    In this episode of the AI in Automotive Podcast, I am pleased to host Jorit Schmelzle, co-founder and Chief Product Officer of Peregrine Technologies. This is such a wide-ranging conversation, covering everything from sensor fusion to the automotive perception stack, and how ordinary smartphones can deliver interesting use cases built on vision and perception.

    While the world of autonomy started with the desire to make a fully-autonomous vehicle that would then power a fleet of robotaxis, the way the space has evolved has created some very compelling use cases, even if full autonomy is still a few years away. Peregrine Technologies is at the forefront of delivering real value today, while also inventing the technology of tomorrow.

    I am sure you will enjoy my chat with Jorit, so please do share the AI in Automotive Podcast with your friends and colleagues who carry an interest in AI or the automotive industry.

    #ai #automotive #mobility #technology #podcast #perception #selfdriving #autonomousdriving

    AI in Automotive Podcast

  • LiDARs are an important piece of the autonomous driving and ADAS puzzle. While they boast impressive resolution and frame rates, they have also built a reputation for being big, bulky and expensive. Can there be another way?

    Paul Drysch, CEO of PreAct Technologies, certainly thinks so. PreAct has been working behind the scenes for a number of years to develop their short-range LiDAR, which aims to deliver all the functionality of a LiDAR at short distances while addressing the biggest drawback of the technology - its cost. Their software-definable LiDAR is to the world of LiDARs what the software-defined vehicle is to traditional cars.

    Join Paul and me on this episode of the AI in Automotive Podcast as Paul gives us a crash course on LiDARs, their types and flavours. We also talk about what the sensor suite in future cars might look like, and where PreAct’s low-cost, short-range LiDAR fits in. Paul believes LiDARs’ time in automotive is yet to come. I am so excited about how technologies like PreAct’s can expand LiDARs’ use cases and accelerate their mainstream adoption.

    https://www.ai-in-automotive.com/aiia/302/pauldrysch

    AI in Automotive Podcast

  • If you, like me, grew up using Windows 3.1, then you are familiar with the dreaded Blue Screen of Death. One of the reasons for that blue screen - in simple terms - was that the computer had run out of resources to run all the tasks that were being demanded of it.

    I have news for you - that reality may be coming to your car sooner than you think. We are putting increasing amounts of computational demands on the modern vehicle. The increasing number of sensors, the increasing resolution of many of these sensors, and computationally intense AD/ADAS tasks mean that current EE architectures and chips are running out of steam. It is believed that the modern car requires ten times the processing capability offered by current architectures.

    So, how does the industry stop cars from freezing up under the burden of heavy computation tasks?

    Enter Marc Bolitho, one of the most knowledgeable people in the automotive semiconductor space. Marc is the CEO of Recogni, a company that has created an incredibly disruptive processor that seems to have achieved the holy grail of performance versus power consumption. I had a remarkably enlightening conversation with Marc, and we spoke about the fundamentals and design process of EE architectures, the limitations of current processors, and how Recogni’s product promises to meet the computational demands of a modern car without breaking a sweat.

    If TOPS means nothing to you, and EE is a telecom operator in the UK, then you have to listen to my chat with Marc. Season 3 is full of fantastic conversations like this one, so stay tuned, and do spread the word about the AI in Automotive Podcast, a platform for dialogues on how AI is shaping the future of the automotive and mobility industries.

    AI in Automotive Podcast

  • There has been a lot of talk recently about vision versus LiDAR and radar. I hosted Leaf Jiang, CEO of a company called NODAR, to learn more about the advantages and limitations of each technology, and how NODAR's own technology overcomes them. Their name is a nice play on the fact that their product is neither radar nor LiDAR; it uses vision to achieve resolution and depth perception better than either of them.

    Instead of relying on machine learning models to interpret the feed from the cameras, NODAR’s system uses a pair of cameras to triangulate the distance to points in the scene, measuring the angle to each point from each camera. There is a lot of complicated geometry involved which, sadly for the nerds amongst you, we will not go into here - though a simplified sketch of the basic relationship follows below.
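
    Purely as a textbook-level illustration of stereo triangulation for an idealised, rectified camera pair (the numbers and function below are my own sketch, not NODAR’s implementation):

    ```python
    def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
        # For an idealised rectified stereo pair, similar triangles give
        # depth = focal_length * baseline / disparity, where disparity is the
        # horizontal pixel offset of the same point between the two images.
        if disparity_px <= 0:
            raise ValueError("disparity must be positive for a point in front of the cameras")
        return focal_length_px * baseline_m / disparity_px

    # Hypothetical example: 1400 px focal length, cameras 1.2 m apart, 10 px disparity.
    print(stereo_depth(1400.0, 1.2, 10.0))  # 168.0 metres
    ```

    The general intuition is that a wider baseline between the two cameras makes small disparities easier to resolve, which is what lets a vision-only system estimate depth at useful ranges.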

    All that said, NODAR’s colour-coded point clouds can be an incredibly powerful source of data for machine learning models that can then do everything from scene inference to path planning, possibly computationally more efficiently.

    I am sure you will love listening to my chat with Leaf on this episode of the AI in Automotive Podcast.

    AI in Automotive Podcast

  • Fancy waking up one fine day to find that your car, much like your smartphone, now has a better interface on the infotainment touchscreen, or that the annoying niggle that was draining your battery has magically been resolved?

    The essence of software-defined vehicles is their ability to keep getting better over time. A lot needs to happen behind the scenes for this to work. How is data from a fleet of vehicles moved into the cloud? How do engineers use this data to identify patterns and improvements? And how are improvements to the software pushed back to the fleet?

    To learn more about some of these themes, I invited Hemant Sikaria to the AI in Automotive Podcast. Hemant is the CEO and Co-founder of Sibros, a software company headquartered in Silicon Valley. Sibros helps automotive OEMs and mobility companies power the connected vehicle ecosystem with their Deep Connected Platform.

    This is an incredible discussion that will help you learn more about the foundation that enables using software and AI to make our vehicles better over time. If you like my conversation with Hemant, do share the AI in Automotive Podcast with a friend or colleague, and drop us a rating wherever you get your podcasts.

    #ai #automotive #mobility #technology #podcast #softwaredefinedvehicle

    https://www.ai-in-automotive.com/aiia/206/hemantsikaria

    AI in Automotive Podcast

  • Buttons and physical interfaces disappearing from your car is now inevitable. That said, we certainly can’t be fumbling with a touchscreen to change the fan speed or switch the radio station. There has to be a better way. That’s what makes me very bullish about voice as the primary human-machine interface in the modern car.

    We have all gotten used to speaking to our smartphones and smart speakers, and getting a lot done - typing out an email, playing our favourite 60s rock album and ordering toilet paper. The voice experience in the car, however, lags far, far behind.

    SoundHound is here to change that. In this episode of the AI in Automotive Podcast, I am speaking to Matt Anderson, SoundHound’s Director of Business Development. Matt lays out exactly how SoundHound’s speech-to-meaning technology is able to understand what you are saying, interpret your intent and respond to you intelligently, whilst tapping into a variety of domains. In addition to the technology itself, we also talk about a brand’s voice identity, which I found incredibly fascinating.

    I am excited about what the future holds in this space, and after listening to my conversation with Matt, I am sure you will be too. And when that happens, do share this episode of the AI in Automotive Podcast with a friend or colleague.

    AI in Automotive Podcast

  • Electric Vehicles might look and drive like normal cars, but scratch beneath the surface and you will realise that they are fundamentally different at an architectural level.

    With the modern car being so much more than merely its mechanicals, I learnt that digital architecture in cars is a thing. The hardware - systems-on-chip (SoCs), processors and screens - combined with the software - the operating system, middleware and applications - brings to life so many elements of the modern car that we take for granted.

    I wanted to learn more about how these elements of the vehicle’s digital architecture work together, and so I invited Anshuman Saxena on the AI in Automotive Podcast. Anshuman is the Head of ADAS/Autonomous Driving at Qualcomm Technologies. He opens up the software-defined vehicle’s digital architecture for us, and introduces us to the Snapdragon Ride Platform. We talk about the platform’s potential to accelerate the development and deployment of autonomous driving technologies, and where this space is headed in the near future.

    If you like my conversation with Anshuman, do head over to the AI in Automotive Podcast on Spotify and Apple Podcasts and give us a thumbs up. Do share our show with your friends and colleagues who are excited by all things automotive.

    AI in Automotive Podcast

  • For the longest time, a connected vehicle to me meant a car with a SIM card, connected to the Internet as an IoT device. But connected vehicles are, and should be, so much more than that. In a world where vehicles can communicate with each other, with other participants on the road and with infrastructure, the possibilities that this unlocks are endless.

    So what is standing in the way of that happening? Technology? Policy? Universal standards? And what role can startups, private corporations and government bodies play in accelerating the evolution to a world of truly connected, V2X-equipped vehicles?

    We invited Wejo’s Sarah Larner to get her perspective on V2V, V2X and all things connected car. Sarah helps us make sense of these topics, and shares with us how Wejo’s vast dataset of 20 trillion data points forms the foundation of the automobile’s future. We discuss how the impending explosion of V2X data will help automotive AI applications go from being reactive to proactive to predictive.

    If you like my conversation with Sarah, do head over to the AI in Automotive Podcast on Spotify and Apple Podcasts and give us a thumbs up. Do share our show with your friends and colleagues who are excited by all things automotive.

    AI in Automotive Podcast

  • What would you define as automotive? Sure, there are cars, motorbikes, vans, trucks and so on. But what about lawn mowers?

    We are stretching the definition of automotive in this edition of the AI in Automotive Podcast. And you will see why. Today we are speaking to Andres Milioto, a Senior Vision Engineer at Scythe Robotics. This company has developed an all-electric, fully-autonomous commercial mower, which, needless to say, uses machine learning extensively, primarily for perception.

    The way the Scythe team has identified this unusual but very large problem, and solved it using machine learning - I find that really cool. They are well on their way to solving two of the biggest problems this very traditional industry faces - a perennial labour shortage, and pollution.

    I was keen to bring Andres on the show to gain a deeper understanding of the similarities, and most importantly, the differences between Scythe Robotics’ application of autonomous technologies and what we might consider more conventional autonomous driving applications.

    I hope you enjoy this very unique episode of the AI in Automotive Podcast.

    AI in Automotive Podcast

  • Over ten years ago, an idea that captured our imagination was that software is eating the world. Well, software may not have eaten the automobile, but it is certainly transforming it in very profound ways.

    The modern car is incredibly complex - not just from a hardware perspective, but also from a software point of view. Today’s software-defined vehicle is made up of numerous subsystems, each run by its own code; hundreds of sensors generating tonnes of data every second; and a number of vehicle functions now automated to a large degree. So how does this all work together?

    We invited Sarah Tatsis, SVP of IVY Platform Development at BlackBerry, to help us understand exactly how. In this episode of the AI in Automotive Podcast, Sarah unpacks the automotive software stack for us - from the foundational operating system software that helps various subsystems communicate with each other, to middleware that uses machine learning models to turn sensor data into actions, to application software on top that will help your car do amazing things like pay for your coffee in the future.
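
    Purely as an illustration of that layering (hypothetical names and data, not BlackBerry IVY’s actual APIs), the flow from sensor to application might be sketched like this:

    ```python
    # Toy sketch of a layered automotive software stack (hypothetical, for illustration only).

    def os_layer_read_sensor() -> dict:
        # Foundational OS / drivers: expose raw signals from a vehicle subsystem.
        return {"wheel_speed_kph": [63.0, 63.2, 62.9, 63.1]}

    def middleware_infer(raw: dict) -> dict:
        # Middleware: turn raw sensor data into a higher-level insight.
        # A production system would run a trained ML model here; we simply average.
        speeds = raw["wheel_speed_kph"]
        return {"vehicle_speed_kph": sum(speeds) / len(speeds)}

    def application_layer(insight: dict) -> str:
        # Application: act on the insight (the "pay for your coffee" class of features).
        return f"Vehicle travelling at {insight['vehicle_speed_kph']:.1f} km/h"

    print(application_layer(middleware_infer(os_layer_read_sensor())))
    ```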

    I thought my cutting-edge new car was smart, but what stuck out most for me in this conversation is the huge amount of headroom still available for our cars to get smarter and more connected to the world around us.

    I hope you enjoy my conversation with Sarah. If you do, please do share the AI in Automotive Podcast with your colleagues who might enjoy it as well.

    https://www.ai-in-automotive.com/aiia/201/sarahtatsis

    AI in Automotive Podcast

  • It has been just over three years since I started the AI in Automotive Podcast. With all of the interesting ways in which artificial intelligence and machine learning are shaping the future of the automotive and mobility industries, I am happy to have hosted quality conversations exploring this theme further.

    My intention with the AI in Automotive Podcast was - and continues to be - to invite interesting people doing interesting things with AI in the automotive industry, and engage them in thought-provoking and enlightening dialogue.

    The twenty episodes we have published since have had over five thousand downloads, and I am incredibly grateful to the regulars amongst you who have kept us going.

    A lot has changed since 2019, and I imagine the pace of change will only accelerate in the coming years. With this in mind, I am aiming to publish more regularly. We will soon release the long-overdue Season 02 of the AI in Automotive Podcast: six episodes, published every other week. The same conversations, only better.

    I might be slightly biased here, but I think if you are associated with the automotive and mobility industries in any capacity, then this podcast is essential listening for you to stay connected with the fascinating changes we are seeing around us.

    So, do me a favour - go ahead and share a link to the AI in Automotive Podcast with your friends and colleagues. You can find us wherever you get your podcasts.

    Continue to tune in, and hear from some of the smartest people in our industry.

    AI in Automotive Podcast

  • A number of companies are trying to crack the autonomous driving puzzle, and a variety of approaches have evolved. Some companies are taking a software-first approach, building an AD software stack that can work on any hardware environment. Others are taking a hardware-first approach, creating a sensor and hardware environment that can adopt any software stack. Others still are creating a ‘walled-garden’ with software and hardware designed in close conjunction, allowing each to work only with the other for a specific use case.

    Imperium Drive, a UK-based autonomous driving startup, has a radically different take on the autonomous driving problem. Imperium Drive believes that a ‘human-in-the-loop’ is a critical stop on the journey to fully autonomous driving. The company is focused on bringing commercially and operationally viable products to market, even as it pursues its ultimate goal of making autonomy a reality.

    We caught up with Koosha Kaveh, CEO of Imperium Drive, on this episode of the AI in Automotive Podcast. Koosha shares his view of the evolution of AD, and progress on the autonomy journey. He also introduces us to Fetchcar, their super interesting driverless, human-in-the-loop car rental service.

    I hope you enjoy listening to this episode, and if you do, please do share it with your network on LinkedIn and rate our show wherever you get your podcasts.

    https://www.ai-in-automotive.com/aiia/120/kooshakaveh

    AI in Automotive Podcast