Mobileye Global Inc. (Mobileye) engages in the development and deployment of advanced driver assistance systems (ADAS) and autonomous driving technologies and solutions.
The company’s portfolio of solutions is built upon a comprehensive suite of purpose-built software and hardware technologies designed to provide the capabilities needed to make the future of ADAS and autonomous driving a reality. These technologies can be harnessed to deliver mission-critical capabilities at the edge and in the cloud, advancing the safety of road users, and revolutionizing the driving experience and the movement of people and goods globally.
The company’s industry-leading technology platform, built upon over 20 years of research, development, data collection and validation, and purpose-built software and hardware design, gives the company a differentiated ability to not only deliver excellent safety ratings and maintain a leadership position with its ADAS solutions, but also to make the mass deployment of autonomous driving solutions a reality. The breadth of the company’s solutions, combined with its global customer base, represents a significant market opportunity for the company. The company’s platform is efficient and modular by design, enabling its customers to productize the company’s most advanced solutions and then leverage those investments to launch even more advanced systems in a modular and incremental manner. The company’s solutions are also highly customizable, which allows its customers to benefit from the core technology supporting its advanced solutions while also augmenting and differentiating their offerings.
The company derives substantially all of its revenue from its commercially deployed ADAS solutions.
As of December 28, 2024, the company’s solutions have been installed in approximately 1,200 vehicle models (including local country, year, and other vehicle model variations), and its System-on-Chips (SoCs) have been deployed in over 200 million vehicles. The company is actively working with more than 50 original equipment manufacturers (OEMs) worldwide on the implementation of its ADAS solutions. For the year ended December 28, 2024, the company shipped approximately 29.0 million EyeQ SoCs and SuperVision systems, of which the substantial majority were EyeQ SoCs.
Technology Platform is Built to Enable the Full-Stack of Autonomous Solutions
The company’s technology platform, which includes its software and hardware intellectual property, leverages its experience as a technology leader for sensing and perception solutions for the automotive industry and its focused efforts to build highly scalable and autonomous solutions. The company’s technologies are foundational to the development and deployment of its ADAS capabilities and consumer AV. The company’s platform is built on five fundamental pillars:
Computer Vision Processing: Mobileye’s history as a cutting-edge deployer of AI-based solutions in the real world starts with the company’s expertise in computer vision processing. ADAS solutions are responsible for saving lives and must meet very high performance metrics with extreme levels of efficiency, as well as withstand increasing oversight from regulatory bodies. The precision requirements for advanced solutions in the Premium ADAS and AV segments are even more exacting. The company is a technology leader in computer vision technology for ADAS, largely through front camera solutions, and it has continuously enhanced its leadership position through its ability to meet the extreme performance, accuracy, and cost metrics of its OEM customers. In recent years, the company has expanded its capability to enable creation of a 360-degree world-view through the processing of multiple cameras placed around the vehicle to support its portfolio of advanced solutions. The company’s products primarily use monocular camera processing that works accurately alone, or together with radar and lidar for redundancy. The software supporting camera processing is diverse, including end-to-end neural network processing (both 2- and 3-dimensional) and model-based techniques, among other approaches. This compound AI system structure leads to internal redundancies within the camera-based perception system that enhance precision through design. The company has been responsible for many ‘industry first’ launches using monocular vision processing, and has enhanced its computer vision capabilities over time to include multiple cameras, such as the trifocal camera configuration (three cameras with different fields of view placed side-by-side facing forward), which has been in series production since 2018, and the 11-camera configuration on its Mobileye SuperVision solution, which was launched in late 2021.
Road Experience Management: The company’s Road Experience Management (REM) technology generates high-precision maps to support advanced ADAS and autonomous vehicle systems from crowd-sourced data that is uploaded and analyzed in the cloud from REM-equipped production ADAS solutions deployed on vehicles on the road. REM is a cloud-based system that leverages the broad installed-base of REM-equipped vehicles to build Mobileye Roadbook, its crowd-sourced, high-precision definition maps of roads from around the world. The company’s REM mapping system harvests small packets of Road Segment Data from various vehicle models produced by its partner OEMs that are equipped with special processing software that extracts only the relevant information necessary to support increasing levels of ADAS and autonomous driving. The Road Segment Data is uploaded to the cloud where its software automatically creates and updates a detailed and accurate model of the road. The company’s REM mapping system seamlessly creates high-precision maps from such Road Segment Data in the cloud at centimeter-level detail, which are then delivered to the edge and integrated with its computer vision engines to provide vehicles with real-time intelligence, including situational awareness, context, and foresight. Mobileye Roadbook was designed to provide the driving solution with a pre-aggregated representation of relevant static and slowly changing elements of the environment (road geometry, boundaries, and semantics) and temporary events, such as construction zones and road debris, at a high refresh rate.
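A minimal sketch of the statistical idea behind crowd-sourced mapping described above (this is an illustrative assumption, not Mobileye’s actual REM pipeline): averaging many independent, noisy drive-by observations of the same static road feature shrinks the aggregate error roughly as one over the square root of the number of drives, which is how a fleet of meter-accurate sensors can produce a centimeter-scale map.

```python
# Illustrative only: the feature position, noise level, and aggregation
# method are hypothetical, chosen to show the averaging effect.
import random
import statistics

random.seed(0)
TRUE_POSITION_M = 100.0   # hypothetical lane-marking position (meters)
PER_DRIVE_NOISE_M = 0.5   # per-vehicle measurement noise (std dev, meters)

def aggregate_error(n_drives: int) -> float:
    """Absolute error of the mean of n noisy drive-by observations."""
    observations = [random.gauss(TRUE_POSITION_M, PER_DRIVE_NOISE_M)
                    for _ in range(n_drives)]
    return abs(statistics.mean(observations) - TRUE_POSITION_M)

few = aggregate_error(10)        # error on the order of ~0.15 m
many = aggregate_error(10_000)   # error on the order of ~0.005 m
print(f"error with 10 drives:     {few:.3f} m")
print(f"error with 10,000 drives: {many:.3f} m")
```

In practice a real pipeline aligns and aggregates whole geometric traces rather than single points, but the same averaging principle drives the precision gain.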
Compound Artificial Intelligence Systems, including True Redundancy: The company’s Compound AI structure supports precision, recall, and overall efficiency by design. While recent advancements in transformer-based architectures and Generative AI create efficiencies in learning-based systems, which Mobileye makes full use of, they also introduce shortfalls such as a lack of abstraction, the shortcut-learning problem, alignment issues (i.e., learning from common but incorrect data), and the long-tail problem (i.e., identifying and fixing edge cases one by one). This approach inevitably drives high recall at the expense of precision, which introduces significant risk given the complexity of real-world driving scenarios. The company’s solution, which aligns with the latest developments in Generative AI even in non-safety-critical applications, includes the insertion of abstractions and an architecture with multiple levels of redundancies that support a quadratic improvement in precision through design. Insertion of abstractions can successfully convert large sets of out-of-distribution edge cases to in-distribution without the need for iterative network re-training with more and more data. Additionally, the company’s structure includes redundancies within the computer vision stack, the fusion of mapping with real-time perception, the fusion of decomposable and end-to-end architectures, and True Redundancy: the fusion of independent world-views produced by separate vision and radar/lidar-based subsystems. When it comes to safety, the company’s multi-faceted high-level fusion structure is governed by a Primary / Guardian / Fallback (PGF) methodology, which can handle majority-rule decisions as well as non-binary discrepancies.
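The majority-rule arbitration idea can be sketched as follows. This is a hypothetical toy model, not Mobileye’s PGF implementation: the action names, severity ordering, and tie-breaking rule are all illustrative assumptions.

```python
# Toy Primary/Guardian/Fallback-style arbitration: agree by majority when
# possible; on a three-way (non-binary) discrepancy, defer to the most
# conservative action. All names and rules here are illustrative.
def pgf_arbitrate(primary: str, guardian: str, fallback: str) -> str:
    votes = [primary, guardian, fallback]
    for action in votes:
        if votes.count(action) >= 2:   # majority rule
            return action
    # No majority: pick the safest action under an assumed severity order.
    severity = {"brake": 0, "slow": 1, "go": 2}
    return min(votes, key=lambda a: severity.get(a, 0))

print(pgf_arbitrate("go", "go", "brake"))    # majority wins -> "go"
print(pgf_arbitrate("go", "slow", "brake"))  # no majority -> "brake"
```

The point of such a structure is that a single subsystem’s false positive or false negative is outvoted or overridden rather than propagated directly into a driving decision.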
Next Generation Imaging Radars: This solution is designed to complement the camera-based system with a sensor that has nearly fully independent failure modes, which supports high precision, and to reduce the need for multiple expensive lidar sensors, which supports cost-efficiency, a major component of recall. The in-house development of imaging radar is a key enabler of the company’s goal of building a cost-effective, fully autonomous driving system. The company’s radar is expected to deliver rich point-cloud models like those customary of lidar, with far higher resolution and significantly more dynamic range than traditional radar. During 2024, these goals were validated through widespread testing of its B-sample hardware by a number of OEMs. These radars differ from legacy radar and other imaging radar development in that, backed by advanced processing algorithms, they can enable an independent sensing state with failure modes independent of the camera-based system, which supports a quadratic improvement in mean time between failures. The company’s choice to focus on the evolution of the radar modality is also related to its cost structure, which is significantly below that of lidar sensors. The company has selected multiple manufacturers for this solution, which is approaching start of production.
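The ‘quadratic improvement’ claim follows from basic probability: if two subsystems fail independently, the chance that both fail at the same moment is the product of their individual failure probabilities. The numbers below are illustrative assumptions, not Mobileye figures.

```python
# Back-of-the-envelope check of the quadratic mean-time-between-failures
# (MTBF) claim for two independent sensing subsystems. The failure
# probabilities are hypothetical round numbers.
p_camera_failure = 1e-4  # assumed per-hour failure probability (camera stack)
p_radar_failure = 1e-4   # assumed, independent of the camera stack

p_joint_failure = p_camera_failure * p_radar_failure  # both fail at once
mtbf_single = 1 / p_camera_failure   # ~10,000 hours for one subsystem
mtbf_fused = 1 / p_joint_failure     # ~100,000,000 hours for the fused pair

print(f"single-subsystem MTBF: {mtbf_single:,.0f} h")
print(f"fused MTBF:            {mtbf_fused:,.0f} h")
```

With equal failure rates, the fused MTBF is the square of the single-subsystem MTBF, which is why independence of failure modes (rather than raw sensor quality alone) is the lever emphasized above.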
The company’s Family of Purpose-Built EyeQ SoCs: Fundamental to the company’s leadership position in ADAS and its ambitions to develop the most cost-efficient, high-performing AV solutions, its EyeQ SoCs incorporate a set of proprietary compute-acceleration models to enhance the accuracy, quality, and functional safety of its perception solutions, while minimizing the power consumption to address the requirements of the automotive market. The EyeQ family design also enables a scalable Electronic Control Unit (ECU) architecture, thereby supporting a variety of ADAS and autonomous vehicle solution architectures that meet the functional safety requirements of its customers. These solutions range from base windshield-mounted ECUs to multi-SoC central compute ECUs supported by EyeQ5 High and, in the near future, EyeQ6 High, which can be deployed in a scalable way to support a full suite of Premium ADAS and AV solutions.
Efficiency of the company's inference silicon design is a core enabler of the overall efficiency of its system and is critical in applications like automotive, which highly value packaging size and power consumption. Designing an efficient silicon architecture requires optimizing the competing factors of efficiency and flexibility. The company accomplishes this through the development of a variety of accelerators, each of which is designed to perform specific tasks that either most favor efficiency, flexibility, or a combination of both. Successful design has led to a 10-times improvement in frames-per-second processing for EyeQ6 High, as compared to EyeQ5 High, despite only double the headline processing power (i.e., TOPS) and 25% higher power consumption. Based on competitive benchmarking, the EyeQ6 High is significantly more efficient than more general-purpose SoCs that have much higher headline processing power, in terms of frames-per-second, latency, and cost.
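Working directly from the ratios stated above (roughly 10x frames-per-second, 2x headline TOPS, and 25% higher power for EyeQ6 High versus EyeQ5 High), the implied efficiency gains can be derived:

```python
# Derived per-TOPS and per-watt gains from the stated generation-over-
# generation ratios. Only the three input ratios come from the text;
# the derived figures follow arithmetically.
fps_gain = 10.0    # frames-per-second improvement (stated)
tops_gain = 2.0    # headline compute (TOPS) increase (stated)
power_gain = 1.25  # power consumption increase (stated: +25%)

fps_per_tops_gain = fps_gain / tops_gain    # 5x more frames per unit compute
fps_per_watt_gain = fps_gain / power_gain   # 8x more frames per watt

print(f"frames per TOPS improvement: {fps_per_tops_gain:.0f}x")
print(f"frames per watt improvement: {fps_per_watt_gain:.0f}x")
```

In other words, the claimed advantage rests on architectural efficiency (work done per TOPS and per watt), not on raw headline compute.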
The company's EyeQ5 SoCs and subsequent generations are increasingly customizable by its OEM customers, supported by its Driving Experience Platform (DXP). DXP is a software platform that enables automakers to develop and customize the driving experience (i.e., the OEM-unique aspect of a vehicle’s automated driving features) while utilizing Mobileye’s proven core technology perception and driving policy software (i.e., the objective, universal aspects of a vehicle’s automated driving features). This new application programming interface supports the company's customers’ desire to create unique products from its technology while also accelerating time-to-market and reducing overall execution risk.
The Autonomous Vehicle Revolution
Mobileye is leading the evolution from ADAS to autonomous driving.
Vehicle autonomy can be viewed as a spectrum that uses the same technology building blocks to power the full span of driver assist functions, ranging from those available in hundreds of car models today, through full autonomy powering robotaxis, and eventually, personal autonomous vehicles. The automotive industry breaks down this spectrum into what are known as SAE Levels 1, 2, 3, 4 and 5. The company has developed its own, more user-friendly taxonomy. Each level of its taxonomy is further defined and supported by the particular operational design domain (ODD) for which it was designed.
First, the company refers to basic driver assist features, such as automatic emergency braking or lane keeping assist, together with longitudinal control, such as adaptive cruise control, as ‘eyes-on/hands-on’. The driver is still responsible for the overall task of driving, while the system supports the human driver.
Second, ‘eyes-on/hands-off’. This refers to premium driver assist functions adding additional safety and comfort functionality. This functionality allows the driver to experience hands-free driving while requiring the driver’s full attention and eyes on the road.
Third, ‘eyes-off/hands-off’. The system controls the driving function within a specified ODD, such as highway driving, without the need for the human driver to monitor driving. If the ODD is exceeded and the driver does not reassume control, the system is capable of performing a Minimum Risk Maneuver (MRM) and safely stopping at the roadside.
Fourth, ‘no driver’. When no human driver is present, e.g., in a robotaxi, the system will perform an MRM when needed, including coming to a full stop, and can also contact a teleoperator for decision support, such as re-routing and rules decisions.
The company’s ADAS solutions, which have been deployed in over 200 million vehicles, are important building blocks for these more advanced autonomous systems.
Solutions
The company has utilized the technology pillars detailed above to build a robust portfolio of end-to-end ADAS and autonomous driving solutions that provide the capabilities needed for the future of autonomous driving, leveraging a comprehensive suite of purpose-built software and hardware technologies. The company pioneered ‘base’ ADAS features to meet global regulatory requirements and safety ratings with its Base ADAS solution, and it has since created new categories of ADAS with its Cloud-Enhanced ADAS, Mobileye Surround ADAS, and SuperVision offerings. Additionally, the company has designed a full set of eyes-off/hands-off solutions at a wide variety of price points and a spectrum of functionalities and ODDs.
End-to-End ADAS and AV Solutions
Mobileye Base ADAS
Mobileye’s Base ADAS, powered by the company’s purpose built, on-windshield EyeQ SoC devices and its expertise in computer vision, brings its core ADAS solutions to millions of vehicles on the road today and is foundational to its spectrum of ADAS and autonomous vehicle solutions. The company’s EyeQ SoC provides drivers with basic safety features covered by front-facing sensing, such as collision warning, lane departure warnings, pedestrian and cyclist collision warning, headway monitoring and warning, speed limit indicator, blind spot detection, and many more. The company’s software algorithms and purpose-built hardware are designed to provide the driver with accurate and reliable driver assist solutions, promoting road safety.
Cloud-Enhanced ADAS
Mobileye’s Cloud-Enhanced ADAS leverages crowdsourced data from millions of REM-equipped vehicles around the globe every day, providing high-level accuracy localization via continuously updated information about the driving scene. Enhancing the existing single-camera system with crowdsourced data offers comprehensive in-path assist functionality that enables better performance and compliance even in complex or challenging circumstances. Relying on data from prior human driving activity to anticipate and adapt, the company’s Cloud-Enhanced ADAS solution provides a safer, smoother, and more natural driving experience – marking a software defined leap in ADAS performance with no need for additional hardware.
Mobileye Surround ADAS
Building on the company’s ADAS expertise and the core of its single-camera Cloud-Enhanced ADAS system, it offers through its Mobileye Surround ADAS system the ability to meet expanded late-decade active safety requirements through the utilization, analysis, and processing of additional surround perception sensors. Mobileye Surround ADAS utilizes the SuperVision software stack, including its RSS policy model, and is powered by a Mobileye ECU with one EyeQ6 High SoC, which processes data from the customer’s third party sensor suite featuring up to six cameras and up to five radars. Such cameras generally consist of two long-range cameras in the front and rear, while leveraging data from four short-range surround vision cameras that are already equipped on many production vehicles today for parking visualization purposes. Additionally, Mobileye Surround ADAS offers eyes-on/hands-off functionality for highway ODDs by adding features like automatic lane change, front and rear collision avoidance, traffic jam assist, and a Highway Pilot function up to 130 kilometers per hour with the fidelity of a multi-camera and multi-radar sensor suite. This system also includes DXP support, which enables customers to customize the driving experience while benefiting from the company’s industry-leading technology platform.
Mobileye SuperVision
Mobileye SuperVision, the company’s eyes-on/hands-off Premium Driver Assist offering, is its most advanced driver assist system on the market and the bridge to consumer autonomous vehicles. It is designed to handle standard driving functions across various road types, offering ‘hands-off’ navigation capabilities under certain ODDs, while still requiring the driver to pay full attention and keep eyes on the road. Derived from the company’s autonomous vehicle research and development, Mobileye SuperVision leverages cloud-based enhancements, such as REM, a number of algorithmic and architectural redundancies, and its RSS policy model. The system utilizes 360-degree surround sensing with 11 third-party cameras (plus optional radar), processed by a turnkey ECU with two EyeQ5 or, in the future, two EyeQ6 SoCs. Furthermore, in addition to supervised point-to-point assisted driving, Mobileye SuperVision is capable of changing lanes, managing priorities, and turning in intersections, as well as engaging in automated parking, preventative (i.e., evasive) steering and braking, and other Driver Assist features. This solution is further supported by OTA updates. The 11 third-party cameras (seven long-range cameras and four short-range surround vision cameras) provide full surround coverage and consist of 120-degree and 28-degree cameras in the front, four 100-degree corner cameras (two front-facing and two rear-facing), a 60-degree rear camera, and four wide-view 195-degree short-range cameras mounted on the side mirrors and front and rear bumpers. The mapping is powered by REM and integrated with computer vision perception to create a 360-degree environmental model (subject to the availability of map data), and RSS constrains the driving decisions to be compliant with an underlying, formally proven model for safe driving decisions.
This offering also includes DXP, which will enable customers to control the driving experience while benefiting from the company’s industry-leading technology platform.
Importantly, the company’s SuperVision technology also serves as a bridge, or foundational technology, for Mobileye and its customers to develop a spectrum of eyes-off/hands-off solutions with expanding ODDs. In other words, an OEM that adopts and validates SuperVision is taking a significant step towards consumer AV as SuperVision serves as a validated baseline, including a common primary ECU board, which can be leveraged to add eyes-off functionality under an increasing set of operating conditions in a modular way.
The first series production launch of this offering occurred in 2021 as Geely Group launched Mobileye SuperVision in its ZEEKR premium electric vehicle brand. Through the end of 2024, over 300,000 SuperVision systems were delivered to ZEEKR and other brands.
Mobileye Chauffeur and Mobileye Drive
Mobileye Chauffeur is the company’s geographically scalable eyes-off/hands-off solution for consumer vehicles in a gradually expanding ODD, combining computer vision technology with surround imaging radars and front lidar. The first generation solution will be based on three EyeQ6 High SoCs, deployed with a primary board, including two EyeQ6 High SoCs supporting full surround computer vision perception and mapping, and a secondary board with an additional EyeQ6 High SoC supporting radar / lidar perception and its Compound AI fusion architecture. The primary board is common to the company’s SuperVision solution, which reduces the OEM’s validation burden, and the dual-board setup provides functional safety redundancy. The system will provide 360 degrees of coverage through two independent and redundant sensing subsystems, along with REM maps, RSS, and its PGF architecture, to support optimized scalability and safety. By using the Mobileye SuperVision eyes-on/hands-off system as a basis for Mobileye Chauffeur, the company allows for an incremental and modular eyes-off transition from one ODD to the next. This can be achieved by adding more active sensors for redundancy and more compute power to the already validated and road-tested Mobileye SuperVision. This approach provides the company's customers with a viable, modular, and incremental path toward useful and safe consumer AV solutions.
Mobileye Drive is the company's fleet-focused end-to-end self-driving system that enables automakers, public transportation companies, and transportation network operators to offer a no-driver solution for robotaxis, ride-pooling, public transport, and goods delivery. This eyes-off/hands-off/no-driver solution will build upon the core autonomous driving technologies found in Mobileye Chauffeur and will deliver driving functions without the need for any in-vehicle human intervention by adding teleoperability and minimizing cases where human input would be required. The company's overall turnkey self-driving solution offers an advanced ODD that enables a variety of vehicle configurations and solutions to operate autonomously. Mobileye Drive is already being integrated and is in the development, testing, and validation stages in autonomous public transit, autonomous goods delivery, and autonomous mobility-as-a-service (AMaaS) across industries and around the globe.
Models for AV Adoption
The company expects that autonomous vehicle technology will eventually be accessed by consumers through shared-vehicle AMaaS networks, as well as in consumer-owned and operated autonomous vehicles. It is the company's view that, to reach the full potential of autonomous driving over the long term, the technology solutions that enable these separate markets should converge over time, and that belief is reflected in its strategy. Autonomous driving has the potential to dramatically increase the proliferation of shared mobility, creating greater utilization of what is a significantly underutilized asset, the car.
Automation would allow the individual to be significantly more productive during their commute or other time spent in the car, given that the vehicle could operate eyes-off/hands-off in an increasingly wide ODD. Providing consumers with access to affordable autonomous vehicles can create significant value by decreasing time spent focused on the driving function and increasing safety.
As autonomous driving technology advances, a number of new transportation use cases are expected to emerge around the type of vehicle ownership, what is transported, and where and when the vehicle can operate. As fleet operators increase network scale and availability of vehicles, the value of the platform to the user base will rise.
Growth Strategies
Key levers of the company’s growth strategy are: benefit from regulatory and safety rating changes promoting base ADAS; capitalize on cloud-enhanced ADAS features; further enhance and drive adoption of the company’s premium driver assist solutions; innovate and commercialize its next-generation autonomous driving solutions; utilize the company’s flexible platform to expand its collaboration with its OEM customers; capitalize on its active sensor technology; accelerate its roadmap of next-generation proprietary EyeQ SoCs; utilize the company’s substantial and growing dataset to continuously improve the intelligence and robustness of its solutions; establish its eyes-off/hands-off autonomous and AMaaS solutions; and benefit from opportunities in large emerging markets.
Customers
The company’s customers include leading OEMs, which it primarily sells to through Tier 1 automotive suppliers that implement its product into automotive vehicles, as well as fleet owners and operators.
OEMs
The company’s market position has remained strong across a broad set of customer relationships for many years. The company is actively working with more than 50 OEMs worldwide on the implementation of its ADAS solutions.
Tier 1 Automotive Suppliers
The company supplies certain OEMs with the EyeQ platform through its arrangements with automotive system integrators, known as Tier 1 automotive suppliers, which are direct suppliers to OEMs. The company’s Tier 1 customers include Aptiv, Magna, Valeo, ZF, Imotion and others.
Mobility-as-a-Service
The company expects to sell Mobileye Drive, its self-driving system, to a range of transportation network companies, public transit operators, and vehicle OEMs, which intend to operate a variety of services (e.g., consumer-facing AMaaS, transportation on demand, delivery). These partners could produce vehicles themselves and integrate Mobileye Drive with the company’s assistance.
Partnerships with STMicroelectronics and Intel
The company's long-standing relationship with STMicroelectronics N.V. (STMicroelectronics) continues to strengthen with the complexity of its solutions. The partnership includes close collaboration in product development, design, and manufacturing. For example, the company has co-developed six EyeQ generations, including the EyeQ6. The company also benefits from STMicroelectronics’ advanced packaging and testing capabilities and automotive expertise.
The company's close partnership with Intel exists on multiple fronts. As a result of its relationship with Intel, the company has access to certain technologies that support the design and development of its software-defined radar, including Intel’s mmWave technologies. Additionally, the company intends to explore a collaboration with Intel on a technology platform to integrate its EyeQ SoC with Intel’s market-leading central compute capability, with plans to utilize Intel Foundry Services’ advanced packaging capabilities. This potential platform is intended to enable functions essential to safety, entertainment, and cloud connectivity. Intel’s strength in government affairs and policy development around the world will continue to be of significant value to the company as it collaborates with regulators who are preparing frameworks to enable the commercial deployment of AVs.
Manufacturing
The company’s products are designed and manufactured specifically for automotive applications after extensive validation tests under stringent automotive environmental conditions.
The company partners with STMicroelectronics, a supplier and innovator of semiconductor devices for automotive applications, in manufacturing, design, and research and development. The company has co-developed six generations of its automotive-grade SoC, EyeQ, with STMicroelectronics, including EyeQ5 and EyeQ6. The company designs the front-end, while STMicroelectronics designs the back-end package and also provides testing, quality assurance, customer care, failure analysis, and manufacturing standards. All of the company’s EyeQ integrated circuits are manufactured by STMicroelectronics or outsourced by STMicroelectronics to a partner foundry.
The company has also established a relationship with Quanta Computer and other suppliers to develop and assemble its ECUs, including its reference design for the company’s Mobileye SuperVision solution, which includes its EyeQ5 SoCs from STMicroelectronics.
Regulation
The company’s data-collection processes implement strict methodologies to comply with data protection and privacy laws, including the EU General Data Protection Regulation (the GDPR), the U.K. General Data Protection Regulation, and U.S. federal and state laws, including the California Consumer Privacy Act of 2018 (the CCPA), as amended by the California Privacy Rights Act of 2020 (the CPRA).
Competition
The company’s competitors in the silicon provider category include Ambarella, Advanced Micro Devices, Arriver / Qualcomm, Black Sesame Technologies, Horizon Robotics, Huawei, NVIDIA, NXP, Renesas Electronics, and Texas Instruments. The company’s competitors in the software-only category include StradVision, Autobrains and Wayve.
In the autonomous driving market, including AMaaS and consumer AV, the company faces competition from technology companies, internal development teams from the automakers themselves (sometimes in combination with investments in early-stage autonomous vehicle technology companies), Tier 1 automotive suppliers, and robotaxi providers. AMaaS competitors include Cruise, Motional, Waymo, and Zoox in the United States and Europe, and Auto X, Baidu, Deeproute.ai, Didi Chuxing, Momenta, and WeRide in China. Consumer AV competitors include Sony and Tesla, which are developing self-driving vehicles for consumers.
Moovit’s free and subscription-based application competition includes Alphabet, Apple, Citymapper, and Transit.
Distribution and Marketing
The company’s products are sold directly to customers throughout the world, or through distribution channels for its remaining inventory of aftermarket products meant for vehicles that do not come pre-equipped with ADAS technology.
The company actively promotes its brand and technologies to increase awareness and generate demand through direct marketing, as well as co-marketing programs. Its direct marketing to consumers and businesses primarily includes trade events, industry and consumer communications, and press relations. The company works closely with its existing customers to ensure that it is aware of their requirements and plans for future car models and can respond promptly and effectively.
Intellectual Property
As of December 28, 2024, the company held 382 U.S. patents, 74 European patents, 175 U.S. patent applications, 608 European and other non-U.S. patent applications, and provisional patent filings.
History
Mobileye was founded in 1999. Mobileye Global Inc. was incorporated in 2022.