Adaptive Computing Powering Fusion from Sensors to Domain Controller
Garfield Mao, Sr. Product Marketing Manager, AMD
Dr. Ruitong Zheng, CTO of Tanway Tech
June 14th, 2023
AMD Official Use Only - General

Abstract: "Purely vision-based perception fails easily in harsh weather, and similar sensors tend to fail from the same cause, so fusing LiDAR with vision increases reliability. The challenge is calibration: the alignment points between the two sensors must be found in real-time processing. China-region customer Tanway takes advantage of the high parallelism of AMD FPGAs for spatial alignment, and of their high precision for time synchronization, finally realizing hardware-level image pre-fusion."

Topics
- Fusion Requires High Precision and High Parallelism
- Case Study: Pre-image Fusion (Tanway Technology)

Fusion Requires High Precision and High Parallelism

Perception Trends
- More sensors: camera, LiDAR, radar
- Bigger data: from GB/s to TB/s (roughly a 10x step-up)
- More complex neural networks
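The GB/s-to-TB/s trend can be made concrete with a quick back-of-the-envelope calculation. Every sensor count, resolution, and rate below is an illustrative assumption, not a figure from the presentation:

```python
# Back-of-the-envelope estimate of aggregate raw sensor bandwidth.
# All sensor counts, resolutions, and rates are assumptions.

def camera_rate_bps(width, height, bytes_per_pixel, fps):
    return width * height * bytes_per_pixel * fps * 8

def lidar_rate_bps(points_per_second, bytes_per_point):
    return points_per_second * bytes_per_point * 8

cameras = 8 * camera_rate_bps(1920, 1080, 2, 30)   # 8 cameras, YUV422 @ 30 fps
lidars = 4 * lidar_rate_bps(1_000_000, 16)         # 4 LiDARs, 1 Mpt/s, 16 B/point
total_gbps = (cameras + lidars) / 1e9
print(f"aggregate raw sensor traffic ~= {total_gbps:.1f} Gb/s")  # ~8.5 Gb/s
```

Even this modest suite approaches 10 Gb/s of raw traffic before any preprocessing, which is why aggregation and pre-fusion close to the sensors matter.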
From 2D to 3D: fusing camera RGB with LiDAR XYZ.

Working Flow Overview of Autonomous Driving
Perception and fusion are still the most important pieces in AD.
- Perception and fusion: laser scanning, point cloud map, joint calibration and distortion removal
- Positioning: multi-laser calibration
- Behavioral decision: scenario decisions (crossroads scenario, gateway convergence scenario), route planning
- Planning: global planning, local planning
- Executive control: longitudinal and lateral control algorithms, chassis control, steering

Multi-Sensor Fusion of Autonomous Driving
Camera, LiDAR, and RADAR are compared on classification, color detection, robustness to interference, adverse weather, night operation, sensor cost, SoC processing performance, velocity detection, angle resolution, range resolution, and range/FoV. No single sensor wins on all criteria: they learn from each other's strengths and complement each other's weaknesses.

Discussion about Fusion
The fusion pipeline splits into an image branch and a LiDAR branch, whose outputs are combined downstream.
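The two-branch structure can be sketched in a few lines: each branch reduces its raw input to a feature vector, and fusion concatenates them for a downstream head. The hand-rolled features here are toys and do not reflect Tanway's or AMD's actual pipeline:

```python
# Toy two-branch fusion: image branch + LiDAR branch -> fused feature.
import numpy as np

def image_branch(rgb):                    # rgb: (H, W, 3) array
    return rgb.reshape(-1, 3).mean(axis=0)           # (3,) mean-colour feature

def lidar_branch(points):                 # points: (N, 4) = x, y, z, reflectivity
    centroid = points[:, :3].mean(axis=0)            # (3,) geometric centre
    mean_refl = points[:, 3:].mean(axis=0)           # (1,) mean reflectivity
    return np.concatenate([centroid, mean_refl])     # (4,)

def fuse(rgb, points):
    return np.concatenate([image_branch(rgb), lidar_branch(points)])  # (7,)

feat = fuse(np.zeros((4, 4, 3)), np.ones((10, 4)))
print(feat.shape)   # (7,)
```

In a real system the branches would be learned networks and fusion could happen earlier (raw data) or later (detections); the point is only where the two data paths meet.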
Challenge in Central Compute and Edge Sensors

Challenge in Central Compute
- Throughput bottleneck: raw data volume, I/O performance, output capability
- Poor scalability: different protocols, data types, and extension capability
- Difficult clock alignment: requires high-precision time synchronization
- Disrupted NN acceleration flow: continuous preprocessing interrupts neural-network algorithm processing

Challenge at Edge Sensors
- A single sensor is prone to failure, and similar sensors can fail together from the same root cause
- Calibration: requires constant calibration and matching of data
- 3D fusion: algorithms keep evolving and new fusion methods are still being explored

An adaptive platform is required to cover both domain processing and the sensors.

AMD Adaptive Computing Powering Fusion
- Customizable IP: flexible algorithms; synergy between software and hardware
- High parallelism: efficient processing of large amounts of parallel data
- Functional safety: meets ISO 26262
- Flexible heterogeneity: adapts to different types of sensors
- Scalable I/O: unrestricted in type and quantity
- High precision: hardware-timed clock synchronization
- Low latency: unrestricted parallel pipeline depth
- Information security: data security and privacy protection features
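The hardware-timed clock synchronization point can be viewed in software terms: once both sensors stamp data against a common clock, association reduces to nearest-timestamp matching within a tolerance. A minimal sketch, with timestamps in microseconds and all values invented:

```python
# Pair each camera frame with the nearest LiDAR timestamp within a
# tolerance. Both timestamp lists are assumed sorted and on one clock.
import bisect

def pair_by_timestamp(cam_ts, lidar_ts, tol_us=100):
    pairs = []
    for t in cam_ts:
        i = bisect.bisect_left(lidar_ts, t)
        candidates = lidar_ts[max(0, i - 1):i + 1]   # neighbours of t
        best = min(candidates, key=lambda s: abs(s - t), default=None)
        if best is not None and abs(best - t) <= tol_us:
            pairs.append((t, best))
    return pairs

cam = [1_000_000, 1_033_333, 1_066_667]       # ~30 fps camera frames
lidar = [999_950, 1_033_360, 1_070_000]       # LiDAR column timestamps
print(pair_by_timestamp(cam, lidar))
# [(1000000, 999950), (1033333, 1033360)] -- third frame has no match
```

The tighter the hardware synchronization, the smaller `tol_us` can be; with microsecond-level triggering (as in the Tanway case study later) the matching becomes essentially deterministic.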
Example of Sensor Hub Solutions
Distributed camera, radar, and LiDAR sensors connect through PHYs to the AD centralized processing module(s), which combine multi-sensor fusion logic, safety processor(s), compute accelerator(s), and high-performance serial processor(s), each with attached memory, and interface to vehicle control & status and the HMI.
Benefits: data aggregation, pre-processing and distribution; software and hardware decoupling; better workload balance.

Discussion on Camera vs LiDAR

Camera Pixel
- Characteristic: F(u, v) = (R, G, B); RGB image plus gray image
- Advantage: rich texture information
- Defect: insufficient depth information, which makes distance measurement challenging
- Scenario: occluded objects pose serious challenges to target detection and semantic segmentation

LiDAR Point Cloud
- Characteristic: point-based, (x, y, z, r), where r is the reflectivity of each point; or voxel-based, X_v = {x_1, x_2, ..., x_n}, where each eigenvector x_i = (s_i, v_i) and s_i is the center of the voxel cube
- Advantage: high-resolution 3D scanning, high sensitivity
- Defect: resolution varies with distance; no color information
- Scenario: bad weather challenges accuracy and robustness

Camera and LiDAR Fusion
CPM (Camera-Plane Map): the point cloud is projected into camera coordinates. The resulting resolution is lower because the distant point cloud is relatively sparse.
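The CPM projection just described can be sketched with a standard pinhole-camera model. The intrinsic matrix and extrinsic pose below are placeholders, not calibration values from any real rig:

```python
# Project LiDAR points into camera pixel coordinates (pinhole model).
import numpy as np

def project_to_camera(points_xyz, K, R, t):
    """points_xyz: (N, 3) LiDAR points; K: 3x3 intrinsics;
    R, t: LiDAR-to-camera rotation and translation."""
    cam = points_xyz @ R.T + t            # transform into the camera frame
    in_front = cam[:, 2] > 0              # keep points ahead of the camera
    uvw = cam[in_front] @ K.T             # apply intrinsics
    uv = uvw[:, :2] / uvw[:, 2:3]         # perspective divide -> pixels
    return uv, in_front

K = np.array([[1000.0,    0.0, 960.0],    # placeholder focal length / centre
              [   0.0, 1000.0, 540.0],
              [   0.0,    0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)             # placeholder extrinsics
pts = np.array([[0.0, 0.0, 10.0],         # straight ahead, 10 m
                [1.0, 0.5, 20.0]])
uv, mask = project_to_camera(pts, K, R, t)
print(uv)   # [[ 960.  540.] [1010.  565.]]
```

In a real CPM the projected pixel grid is then filled from these sparse hits, which is exactly where the sparsity of distant returns lowers the effective resolution.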
BEV (Bird's Eye View): avoids the occlusion problem by encoding the point cloud into height, density, and intensity maps, at the cost of storage redundancy across a large number of contexts.

Case Study: Pre-image Fusion (Tanway Technology)

What's Hardware-based Pre-image Fusion?
Pre-fusion means that the raw data from multiple sensors are fused first, and classification and calibration are then performed on the unified data. Raw images and laser point clouds provide richer perceptual information and can be aligned and synchronized frame by frame, column by column, and point by point, which is perfectly compatible with pre-fusion perception algorithms. Implemented at the hardware level, a LiDAR-and-camera fusion perception system can easily surpass the human eye, giving full play to the high resolution of images and the accurate ranging of the laser point cloud. Hardware-based pre-image fusion keeps the hardware configuration and hardware cost unchanged without reducing perception accuracy, and can even improve it, achieving the optimal perception effect.

Hardware-based Pre-fusion
True hardware-based pre-fusion gives more complete environmental information, makes targets less likely to be missed or go undetected, and improves data robustness. Each laser measurement point corresponds to a multi-pixel unit, and the multi-dimensional fused information is packaged into a single output data stream (LiDAR point cloud + camera images -> fusion data).

Hardware-level Time-synchronization
Hardware-level spatial alignment and time synchronization give accurate target acquisition, less susceptibility to displacement, and real-time determination of the relative position of the target.
- A solid-state transmit-and-receive (T&R) optical system performs frame-by-frame, column-by-column, and point-by-point time synchronization.
- A synchronized trigger system achieves microsecond-level time synchronization (synchronous triggering of point clouds and images into fusion data).
- A coaxial optical system gives point-to-point correspondence of the raw data.

Sensors Zero Calibration
With LiDAR and image-sensor zero calibration, the relative position is known by construction, so no calibration is required; this reduces the cost overhead of calibration and allows errors to be reduced. The sensor hardware is assembled so as to achieve the fusion effect directly.

Summary
- Fusion is the way to go for autonomous driving; perception and fusion are still the most important pieces in AD.
- Both central compute and edge sensors face these challenges.
- AMD adaptive computing powers fusion with high precision and high parallelism.
- Pre-image fusion is ready for mass production by Tanway Tech.

DISCLAIMER
The information contained herein is for informational purposes only and is subject to change without notice. While every precaution has been taken in the preparation of this document, it may contain technical inaccuracies, omissions and typographical errors, and AMD is under no obligation to update or otherwise correct this information. Advanced Micro Devices, Inc. makes no representations or warranties with respect to the accuracy or completeness of the contents of this document, and assumes no liability of any kind, including the implied warranties of noninfringement, merchantability
or fitness for particular purposes, with respect to the operation or use of AMD hardware, software or other products described herein. No license, including implied or arising by estoppel, to any intellectual property rights is granted by this document. Terms and limitations applicable to the purchase or use of AMD products are as set forth in a signed agreement between the parties or in AMD's Standard Terms and Conditions of Sale. GD-18. © 2023 Advanced Micro Devices, Inc. All rights reserved. AMD, the AMD Arrow logo, and combinations thereof are trademarks of Advanced Micro Devices, Inc. Other product names used in this publication are for identification purposes only and may be trademarks of their respective owners.