Visual Odometry and Mapping for Autonomous Flight Using an RGB-D Camera

Albert S. Huang, Abraham Bachrach, Peter Henry, Michael Krainin, Daniel Maturana, Dieter Fox, Nicholas Roy

Albert S. Huang, Abraham Bachrach, and Nicholas Roy: Massachusetts Institute of Technology, Computer Science and Artificial Intelligence Laboratory, Cambridge, MA 02139, e-mail: albert, abachrac, nickroy@csail.mit.edu

Peter Henry, Michael Krainin, and Dieter Fox: University of Washington, Department of Computer Science & Engineering, Seattle, WA, e-mail: peter, mkrainin, fox@cs.washington.edu

Daniel Maturana: Department of Computer Science, Pontificia Universidad Católica de Chile, Santiago, Chile, e-mail: dimatura@puc.cl

Abstract RGB-D cameras provide both a color image and per-pixel depth estimates. The richness of their data and the recent development of low-cost sensors have combined to present an attractive opportunity for mobile robotics research. In this paper, we describe a system for visual odometry and mapping using an RGB-D camera, and its application to autonomous flight. By leveraging results from recent state-of-the-art algorithms and hardware, our system enables 3D flight in cluttered environments using only onboard sensor data. All computation and sensing required for local position control are performed onboard the vehicle, reducing the dependence on unreliable wireless links. We evaluate the effectiveness of our system for stabilizing and controlling a quadrotor micro air vehicle, demonstrate its use for constructing detailed 3D maps of an indoor environment, and discuss its limitations.

1 Introduction

Stable and precise control of an autonomous micro air vehicle (MAV) demands fast and accurate estimates of the vehicle's pose and velocity. In cluttered environments such as urban canyons, under a forest canopy, and indoor areas, knowledge of the 3D environment surrounding the vehicle is additionally required to plan collision-free trajectories. Navigation systems based on wirelessly transmitted information, such as Global Positioning System (GPS) technologies, are not typically useful in these scenarios due to limited range, precision, and reception. Thus, the MAV must estimate its state and plan collision-free trajectories using only its onboard sensors.

Fig. 1 Our quadrotor micro air vehicle (MAV). The RGB-D camera is mounted at the base of the vehicle, tilted slightly down.

RGB-D cameras capture RGB color images augmented with depth data at each pixel. A variety of techniques can be used for producing the depth estimates, such as time-of-flight imaging, structured light stereo, dense passive stereo, laser range scanning, etc. While many of these technologies have been available to researchers for years, the recent application of structured light RGB-D cameras to home entertainment and gaming [32] has resulted in the wide availability of low-cost RGB-D sensors well-suited for robotics applications. In particular, the Microsoft Kinect sensor, developed by PrimeSense, provides a 640×480 RGB-D image at 30 Hz. When stripped down to its essential components, the Kinect weighs 115 g, light enough to be carried by a small MAV.

Previously, we have developed algorithms for MAV flight in cluttered environments using LIDAR [3] and stereo cameras [1]. LIDAR sensors currently available in form factors appropriate for use on a MAV are very high precision, but only provide range measurements along a plane around the sensor. Since they can only detect objects that intersect the sensing plane, they are most useful in environments characterized by vertical structures, and less so in more complex scenes.

Structured light RGB-D cameras are based upon stereo techniques, and thus share many properties with stereo cameras. The primary differences lie in the range and spatial density of depth data. Since RGB-D cameras illuminate a scene with a structured light pattern, they can estimate depth in areas with poor visual texture, but are range-limited by their projectors.
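To make the range limitation concrete, the sketch below applies the standard triangulation relation used by rectified stereo and structured-light sensors, Z = f·b/d, together with its first-order depth uncertainty. This is our illustration, not material from the paper: the focal length, projector-camera baseline, and disparity-noise values are assumed, roughly Kinect-like numbers chosen only for exposition.

# Illustrative only: depth from triangulation for a rectified
# projector-camera (or stereo) pair, and the first-order depth
# uncertainty dZ ~ Z^2 * sigma_d / (f * b), which grows quadratically
# with range. Parameter values are assumptions, not sensor specs.

def depth_from_disparity(d_pixels, f_pixels=580.0, baseline_m=0.075):
    """Depth in meters from disparity in pixels: Z = f * b / d."""
    return f_pixels * baseline_m / d_pixels

def depth_std(z_m, sigma_d_pixels=0.1, f_pixels=580.0, baseline_m=0.075):
    """Approximate depth standard deviation at range z_m."""
    return (z_m ** 2) * sigma_d_pixels / (f_pixels * baseline_m)

if __name__ == "__main__":
    for z in (1.0, 3.0, 5.0):
        print(f"range {z:.1f} m -> depth std ~ {depth_std(z):.3f} m")

Under these assumed numbers the depth uncertainty grows from a few millimeters at 1 m to several centimeters at 5 m, which is one way to see why the projected pattern limits the useful range of the sensor.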
This paper presents our approach to providing an autonomous micro air vehicle with fast and reliable state estimates and a 3D map of its environment by using an on-board RGB-D camera and inertial measurement unit (IMU). Together, these allow the MAV to safely operate in cluttered, GPS-denied indoor environments. Controlling a micro air vehicle requires accurate estimation not only of the vehicle's position but also of its velocity, estimates that our algorithms are able to provide.

Estimating a vehicle's 3D motion from sensor data typically consists of estimating its relative motion at each time step by aligning successive sensor measurements such as laser scans or RGB-D frames, a process most often known as "visual odometry" when comparing camera or RGB-D images. The primary contribution of this paper is to provide a systematic experimental analysis of how the best practices in visual odometry using an RGB-D camera enable the control of a micro air vehicle.

Given knowledge of the relative motion of the vehicle from sensor frame to sensor frame, the 3D trajectory of the vehicle in the environment can be estimated by integrating the relative motion estimates over time. Given knowledge of the vehicle's position in the environment, the locations of obstacles in each sensor frame can also be used to construct a global map. However, while often useful for local position control and stability, visual odometry methods suffer from long-term drift and are not suitable for building large-scale maps. To solve this problem, we also demonstrate how our previous work on RGB-D Mapping [14] can be incorporated to detect loop closures, correct for accumulated drift, and maintain a representation of consistent pose estimates over the history of previous frames.
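As a minimal sketch of this dead-reckoning step (our illustration, not the paper's implementation), the snippet below composes frame-to-frame rigid-body motions into a global pose estimate; the 4×4 homogeneous-matrix representation and the function names are assumptions chosen for exposition.

# Minimal sketch: integrating frame-to-frame visual odometry estimates
# into a global pose. Each relative motion is assumed to be a 4x4
# homogeneous transform T_k mapping coordinates of frame k into frame k-1.
import numpy as np

def integrate_odometry(relative_transforms):
    """Chain relative motions into the list of global poses
    T_world_k = T_1 @ T_2 @ ... @ T_k. Small errors in each T_k
    compound over time; this is the long-term drift that loop-closure
    detection and global correction address."""
    pose = np.eye(4)            # world frame fixed to the first camera frame
    trajectory = [pose.copy()]
    for T_rel in relative_transforms:
        pose = pose @ T_rel     # right-multiply to append the newest motion
        trajectory.append(pose.copy())
    return trajectory

# Example: repeated small forward steps with a slight, constant yaw error.
yaw = np.deg2rad(1.0)
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])
T_step = np.eye(4)
T_step[:3, :3] = R
T_step[:3, 3] = [0.1, 0.0, 0.0]
poses = integrate_odometry([T_step] * 100)
print(poses[-1][:3, 3])         # position after 100 steps, bent by the yaw bias

Right-multiplication keeps each new relative transform expressed in the most recent camera frame, matching the frame-to-frame convention described above; the constant yaw error in the example shows how even a small per-frame bias curves the integrated trajectory away from the true path.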