We are proud to present Jonathan Phillips as one of the keynote speakers at VECS 2025. Now based in Boulder, Colorado, as VP of Imaging Science at Imatest, Phillips has expanded the company’s software features, equipment portfolio, and consulting services across imaging market segments, including automotive. As the previous head of image quality in Google’s Android organization at its Silicon Valley headquarters, Phillips focused on the launch and establishment of Google’s Pixel phone line, particularly its camera and display image quality and features. We had the opportunity to ask Jonathan a few questions prior to VECS on March 18-19 in Gothenburg.
Could you please introduce yourself and your work at Imatest?
I am VP of Imaging Science at Imatest, a leading image quality testing company based in Boulder, Colorado, USA, that supports camera development for many of the world’s top companies across industries including automotive, mobile electronics, security, aerospace, and medical imaging. Alongside my executive role at Imatest, I serve as Chair of the US national body for ISO Technical Committee 42 and as an expert on the IEEE P2020 committee, both international standards bodies that develop camera image quality metrics. In addition, I am Secretary of the Board of Directors of the Society for Imaging Science and Technology (IS&T), a professional international organization dedicated to keeping members and others apprised of the latest scientific and technological developments in imaging through conferences, educational programs, publications, and its website, imaging.org.
What will you speak about at VECS 2025?
My first talk, on Day 1, will be broader in scope, providing background on defining and measuring camera image quality in ADAS systems for the automotive industry. It is part of Track 1, Technical Challenges & Solutions: Technologies Enabling Mobility Transformation. On Day 2, I will delve into more detail and focus on optimizing camera images for machine vision systems. That talk is part of Track B, Autonomous Solutions & Active Safety: Laying the Road for Safe Autonomous Vehicles.
What is important when trying to optimize machine vision systems?
Most people are aware of the machine learning models that are trained to optimize the machine vision systems fundamental to ADAS and autonomous driving. But of equal importance is optimizing the camera hardware and the image processing pipeline used in the system. This is where Imatest and image quality metrics enter the equation. The camera system and its optical and processing components should be closely evaluated for image quality in order to deliver the best automotive camera system without under- or over-engineering the design. This ensures that the optimal supply cost is achieved, in addition to optimal machine vision for ADAS and autonomous driving applications.
Which are the main obstacles when you are working with camera image quality in ADAS systems?
Important camera image quality attributes include sharpness, distortion, exposure, dynamic range, noise, and color. These attributes can be challenging to handle in ADAS systems because of speed, light, and weather conditions. For example, the video stream from a vehicle camera can be captured while the vehicle is moving at anything from high speed to a standstill. In addition, light levels while driving can vary from complete darkness to very bright outdoor conditions, and they can change quite rapidly, such as when a vehicle passes through a tunnel on a sunny day. As for the driving environment, extreme weather such as rain and snow can impede ADAS camera performance by occluding the optical path. These challenges can be quite different from testing cameras in a controlled lab environment, so we have to continue developing test methods for ADAS systems that tie real-world driving conditions to the lab.
What impact do new international standards such as IEEE P2020 for CMOS image sensors have on the Automotive Industry?
The IEEE P2020 automotive standard has been developed over multiple years by experts from around the world and defines methodologies for measuring camera performance, including flare, geometric calibration, noise, dynamic range, spatial frequency response, flicker, and contrast performance indicators. This, together with other work I am leading in ISO Technical Committee 42, is producing new and important methodologies to ensure that we make impactful decisions when designing CMOS image sensors and camera systems for the automotive industry. Standardizing these metrics ensures consistency and creates cross-industry reference points.
What is the most important thing you would like to share in your presentation?
I want the audience to understand how the evaluation and selection of camera hardware can have a significant impact on optimizing ADAS and machine vision performance, the AI-related aspects of automotive systems. Optimizing the camera system, for both visible and near-infrared sensing, enables better downstream capabilities of the vehicle through improved machine vision.
What are you most looking forward to by attending and speaking at the event?
I am excited to bring a detailed perspective on camera usage in ADAS and autonomous driving to VECS, a high-profile international event that covers many aspects of connectivity and technology beyond camera image quality. The breadth of topics and applications will provide an inspiring setting for us to share, inform, and learn from each other. See you in Gothenburg!