Markerless Image-based 3D Tracking for Real-time Augmented Reality Applications

R. Koch, K. Koeser, B. Streckel, J.-F. Evers-Senne
Institute of Computer Science and Applied Mathematics, Christian-Albrechts-University of Kiel, 24098 Kiel, Germany

email: rk@informatik.uni-kiel.de

Abstract

In this contribution we describe a visual markerless real-time tracking system for Augmented Reality applications. The system uses a fisheye lens mounted on a firewire camera running at 10 fps for visual tracking of 3D scene points without any prior scene knowledge. All visual-geometric data is acquired online during the tracking using a structure-from-motion approach. 2D image features in the hemispherical fisheye image are tracked using a 2D feature point tracker. Tracking may be facilitated by orientation compensation with an inertial sensor. Based on the image tracks, 3D camera egomotion and 3D features are estimated online from the image sequence. The tracking is robust even in the presence of moving objects, as the large field of view of the camera stabilizes the tracking.

1 INTRODUCTION

Augmented Reality (AR) systems aim at the superposition of additional scene data into the video stream of a real camera. One can distinguish between offline augmentation for special effects in video post production [3], and online augmentation, where a user typically carries a head-mounted display. Additional information is either superimposed directly onto the video stream using video see-through devices or it is projected optically into the visual path of the user's gaze direction [1, 2].

The technical and algorithmic demands for online AR are very challenging. The AR equipment must be carried by the user, possibly for a long time, hence it should be light-weight and ergonomic and not hinder free movements. At the same time, computation of camera pose must be very fast and reliable, even in uncooperative environments with difficult lighting situations. This places high computational demands on the system.

Recently, quite some research activities on online AR were undertaken. The work was inspired by the online tracking algorithms from robotics and computer vision. In robotics, the realtime SLAM approach (Simultaneous Localization And Mapping) has been used with non-visual sensors like odometry and ultrasound/laser sensors. These ideas were recently extended to visual tracking [4]. In computer vision, offline AR and visual reconstruction has been in the focus for some years. The dominant approach in this field is termed SfM (Structure from Motion), where simultaneous camera pose estimation, even from uncalibrated cameras, and 3D structure reconstruction is possible [8]. Both approaches have much in common and can be merged towards a versatile realtime AR system [6].

2 ONLINE AR SYSTEM DESIGN

In the following we will describe the components of an online AR system that allows robust 3D camera tracking in complex and uncooperative scenes where parts of the scene may move independently. It is based on the SfM approach from computer vision. The robustness is achieved in two ways:

1. A 190 degree hemispherical fisheye lens is used that captures a very large field of view of the scene. If used in indoor environments, the hemispherical view will always see lots of static visual structures, even if the scene in front of the user may change dramatically. The system is therefore mainly designed for (but not restricted to) indoor use, because in outdoor scenes the sunlight falling directly onto the CCD sensor will cause problems. These problems can be alleviated when CMOS sensors with logarithmic response and high dynamic range are used.

2. The 3D tracking is based on robust camera pose estimation using structure from motion algorithms [8] that are optimized for realtime performance. These algorithms can handle measurement outliers from the 2D tracking using robust statistics.

2.1 System

The goal of the AR system is a light-weight wearable solution that allows realtime augmentation via a HMD without obstructing user motion. The computational load of such a system for simultaneous realtime tracking and augmentation is too high to be performed on currently available wearable computers. We have therefore designed the system with a lightweight wearable unit for the head-mounted display and the image acquisition, which is connected to a back-end PC via a wireless LAN access. In this contribution we are only concerned with the recording and tracking unit and do not handle augmentation.

The video camera system must be extremely small. We have chosen a 640x480 firewire camera with a 12 mm microlens adapter and a microlens fisheye. The image quality of the fisheye lens degrades towards the boundary of the hemisphere, therefore the opening angle is reduced to 160 degrees and a quadratic subimage of 400x400 pixels is processed, resulting in an angular resolution of 3 pixel/degree. Thus a raw data rate of 1.6 MB/s is transferred through the WLAN channel. The backend system is currently able to process 10 fps. In addition, the camera rotation is measured using a 3 DoF inertial sensor at 100 Hz rate. The rotation data is used to compensate fast head rotations and to predict image feature positions.

The backend system runs two separate threads (possibly on a 2-processor unit) that separate the 2D feature tracking from the 3D SfM pose and structure computation. The estimated 3D pose is handed back to the wearable unit and visual augmentation is superimposed onto the user view. Figure 1 gives an overview of the system components.
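The two-thread split can be illustrated as a small producer/consumer pipeline. The sketch below (Python, our own illustration and not the authors' implementation; queue sizes, function names and the stand-in tracker/estimator are assumptions) shows Thread A pushing per-frame 2D feature tracks into a queue and Thread B turning them into poses:

```python
import queue
import threading

# Raw data rate of the processed subimage quoted in the text:
# 400 x 400 pixels x 1 byte x 10 fps = 1,600,000 bytes/s = 1.6 MB/s over the WLAN.
RAW_RATE_BYTES_PER_S = 400 * 400 * 1 * 10

track_queue = queue.Queue(maxsize=4)  # per-frame 2D feature tracks for the 3D stage
pose_queue = queue.Queue()            # estimated poses, sent back for augmentation

def thread_a_2d_tracking(frames, track_2d):
    """Thread A: run the 2D feature tracker on every incoming fisheye frame."""
    for frame in frames:
        track_queue.put(track_2d(frame))
    track_queue.put(None)                         # end-of-sequence marker

def thread_b_3d_pose(estimate_pose):
    """Thread B: estimate camera pose / 3D structure from the queued 2D tracks."""
    while (tracks := track_queue.get()) is not None:
        pose_queue.put(estimate_pose(tracks))     # handed back to the wearable unit

if __name__ == "__main__":
    # Stand-ins for the real tracker and SfM estimator, just to make the sketch run.
    frames = range(10)
    a = threading.Thread(target=thread_a_2d_tracking,
                         args=(frames, lambda f: [("feature", f)]))
    b = threading.Thread(target=thread_b_3d_pose,
                         args=(lambda tracks: {"frame": tracks[0][1]},))
    a.start(); b.start(); a.join(); b.join()
    print(pose_queue.qsize(), "poses estimated")
```

On a 2-processor machine the two threads can run truly in parallel, which is what makes the 10 fps figure reported in Section 3 feasible.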

2.2 Robust tracking from fisheye images

The 3D tracking system is divided into tracking initialization, 2D feature tracking and robust 3D pose estimation. The tracking is facilitated by the 3 DoF inertial rotation sensor.

Initialization and 2D feature tracking: In an initial step, a first set of salient 2D intensity corners is detected in the first image of the sequence. These 2D features are then tracked throughout the image sequence by local feature matching with the KLT operator [9]. If feature tracks are lost, new tracks are constantly reinitialized. The new tracks are merged with previous tracks in the 3D stage to avoid drift.
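A minimal version of this detect-and-track loop can be sketched with OpenCV's corner detector and pyramidal Lucas-Kanade (KLT) tracker. This is only an illustrative sketch of the scheme described above, not the authors' implementation; the parameter values (corner count, window size, re-detection threshold) are assumptions:

```python
import cv2
import numpy as np

def track_sequence(gray_frames, max_corners=100, min_tracked=60):
    """Detect corners in the first 8-bit grayscale frame and track them with
    pyramidal KLT, re-detecting features whenever too many tracks are lost."""
    frames = iter(gray_frames)
    prev = next(frames)
    pts = cv2.goodFeaturesToTrack(prev, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=7)
    for cur in frames:
        nxt, status, _err = cv2.calcOpticalFlowPyrLK(
            prev, cur, pts, None, winSize=(21, 21), maxLevel=3)
        pts = nxt[status.ravel() == 1].reshape(-1, 1, 2)    # keep surviving tracks
        if len(pts) < min_tracked:                          # too many tracks lost:
            fresh = cv2.goodFeaturesToTrack(cur, maxCorners=max_corners,
                                            qualityLevel=0.01, minDistance=7)
            if fresh is not None:
                pts = np.vstack([pts, fresh])               # start new tracks
        prev = cur
        yield pts.reshape(-1, 2)                            # 2D features of this frame
```

The merging of new tracks with previous ones in the 3D stage, as described above, is left out of this sketch.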

As we are handling spherical images from the fisheye lens, care must be taken to compensate the spherical distortions using local planar rectification. To further facilitate 2D matching, the 3D camera rotation velocity is measured by the inertial rotation sensor and the rotation is compensated in the images. Currently we compensate the rotation only, but a parallax compensation by backprojection of 3D features is planned.
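To make the rotation compensation concrete, the sketch below predicts where a tracked feature should reappear after a pure camera rotation taken from the inertial sensor. It assumes an ideal equidistant fisheye model (radius = f * theta) with f chosen to roughly match the 160 degree, 400x400 pixel subimage; both the model and the numbers are simplifying assumptions, not the calibration used in the paper:

```python
import numpy as np

CENTER = np.array([200.0, 200.0])  # principal point of the 400x400 subimage (assumed)
F = 200.0 / np.deg2rad(80.0)       # equidistant focal length: 200 px at 80 deg (~143 px/rad)

def pixel_to_ray(uv):
    """Equidistant fisheye back-projection: pixel -> unit viewing ray."""
    d = np.asarray(uv, float) - CENTER
    r = np.linalg.norm(d)
    if r < 1e-9:
        return np.array([0.0, 0.0, 1.0])
    theta = r / F                                   # angle from the optical axis
    return np.array([d[0] * np.sin(theta) / r,
                     d[1] * np.sin(theta) / r,
                     np.cos(theta)])

def ray_to_pixel(ray):
    """Equidistant fisheye projection: unit ray -> pixel."""
    x, y, z = ray / np.linalg.norm(ray)
    r_xy = np.hypot(x, y)
    if r_xy < 1e-9:
        return CENTER.copy()
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    return CENTER + F * theta * np.array([x, y]) / r_xy

def predict_feature(uv, R_delta):
    """Predict the feature position in the current frame from its previous position.
    R_delta rotates viewing rays from the previous into the current camera frame
    (integrated from the gyro between the two frames)."""
    return ray_to_pixel(R_delta @ pixel_to_ray(uv))
```

Near the image center a rotation by a small angle shifts the predicted position by roughly F times that angle in pixels, which is the prediction the 2D tracker can start its search from.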

Figure 1: Overview of the AR system (wearable unit with fisheye camera, inertial sensor and HMD, connected via WLAN to a backend PC running the 2D feature tracking and 3D pose estimation threads).

3D feature tracking and camera pose estimation: From the given 2D feature tracks, a SfM approach [7] can be applied to estimate the metric camera pose and 3D feature positions simultaneously. Given a set of reliable 2D features, the Essential matrix between the views can be computed and the relative pose of the cameras can be extracted. Simultaneously, 3D feature points can be triangulated from the 2D correspondences and the relative pose. The camera pose and the 3D feature positions are determined relative to an initial camera position and rotation, and up to an unknown overall scale. This scale must be inserted into the system from external data. The SfM is based on the assumption of a rigid scene where the estimated 3D features do not move between views. Therefore, care must be taken to handle moving objects and measurement outliers robustly.
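The two-view core of this step can be sketched with OpenCV as follows. The sketch assumes locally rectified, pinhole-like correspondences with a known camera matrix K (a simplification of the fisheye processing above); the RANSAC inlier mask is also what discards measurements on moving objects, as discussed next:

```python
import cv2
import numpy as np

def two_view_sfm(pts1, pts2, K):
    """Relative camera pose and triangulated 3D points from two sets of
    corresponding image points (Nx2 arrays), pinhole model with camera matrix K."""
    pts1 = np.asarray(pts1, np.float64)
    pts2 = np.asarray(pts2, np.float64)
    # Essential matrix with RANSAC: mismatches and moving objects become outliers.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    # Relative pose (R, t); t is only known up to an overall scale.
    _, R, t, pose_mask = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    # Triangulate the surviving correspondences into 3D feature points.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # first camera at the origin
    P2 = K @ np.hstack([R, t])                          # second camera, scale unknown
    good = pose_mask.ravel() > 0
    X_h = cv2.triangulatePoints(P1, P2, pts1[good].T, pts2[good].T)
    X = (X_h[:3] / X_h[3]).T                            # homogeneous -> Euclidean
    return R, t, X, good
```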

Robustness of the estimation is introduced by robust statistical evaluation of feature matching and Essential matrix computation using RANdom SAmpling Consensus (RANSAC) [7]. Thus, moving objects are treated as measurement outliers that are discarded by the RANSAC. The tracking from fisheye cameras leads to an especially robust tracking for two reasons:

1. The wide field of view covers a very wide scene area and moving objects tend to be only in a small part of the scene. Therefore, most of the visible scene is static. Second, a camera mounted on a human head is subject to large and jerky rotations. These rotations are partially compensated by the rotation sensor, but still the head may rotate the camera quickly out of view. This will not happen easily with a fisheye camera with hemispherical view.

2. It can be shown that a wide field of view stabilizes the pose estimation [5]. For perspective cameras with a small field of view, the motion towards the optical axis is always ill defined because the camera moves towards the focus of contraction (FOC). Only the motion perpendicular to the FOC can be estimated reliably. In a spherical image, there will always be an image position that is perpendicular to the FOC, hence the estimation of the camera motion is always reliable (a back-of-the-envelope version of this argument is sketched after this list). A drawback of the spherical image is the low angular resolution, hence the estimate for a fisheye lens camera will be less accurate than estimates from a sideways moving perspective camera with high angular resolution.
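As a back-of-the-envelope version of the second argument (our own simplification, not a formula from the paper): for a small camera translation t, a scene point at depth Z seen at angle theta from the translation direction moves in the image by an angle of roughly

    delta_alpha ≈ ||t|| * sin(theta) / Z

The parallax vanishes towards the focus of contraction (theta = 0) and is largest perpendicular to it (theta = 90 degrees). A hemispherical view always contains such perpendicular viewing directions, whereas a narrow-field camera looking along t observes only near-zero parallax and hence an ill-conditioned translation estimate.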

3 EXPERIMENTS

We have performed extensive experiments with the system. In the following section we will give some results on timing and on visual camera tracking.

Figure 2 shows camera tracking results of a sequence of 900 views. The camera was moved extensively through space and even rotated up to 180 degrees away from the initial view pose. Still, tracking was possible since the wide field of view allowed that 3D features were visible for a long time. To evaluate the timing, either 50 or 100 features were tracked on a 400x400 pixel image using 3.0 GHz P4 single- and dual-processor PCs. 2D and 3D tracking can be separated into two tasks that run either concurrently on the 1-CPU PC or in parallel on the 2-CPU PC. The timing in Table 1 shows that 10 fps are indeed possible in this configuration for the 2-processor PC.

Table 1: Timing of tracking per frame in ms.

Figure 2: Visualization of 3D camera tracks (image 1 and image 650) with the original fisheye image in the upper right corner.

Figure 3 shows the augmentation results, where the central section of the camera image was mapped to a planar view and synthetic objects were placed on the real table. The objects remained in their allocated place without much jitter.

4 CONCLUSIONS

The presented approach shows that a robust markerless 3D tracking from a fisheye camera system is possible in realtime. The system presented is in an early stage and further fine-tuning is needed. The 3D processing is not yet optimized to speed and we foresee still some potential in this stage. Currently, there is no feedback from the 3D features into the 2D stage. A proper prediction from the full 6 DoF state of the system will enhance the current 3 DoF prediction. The covariances of the 3D features are currently evaluated numerically, which is a costly operation. An analytical solution will further enhance the computation speed. Furthermore, there is currently no solution for the absolute scale of the reconstruction. Augmentation will need the transformation into the Euclidean world. Finally, we need better reinitialization strategies for the 2D tracks by recognizing objects and salient features from the image data to minimize drift.

Figure 3: Visual augmentation of virtual objects in a real scene superimposed on the central part of images 200 and 500.

Acknowledgments

This work was funded partially by the German Ministry of Science project BMBF-ARTESAS and the European Commission project IST-2003-2013 MATRIS.

References

[1] R. Azuma. A survey of augmented reality. In Presence: Teleoperators and Virtual Environments 6, pages 355–385, Aug. 1997.

[2] R. Azuma, Baillot, Behringer, Feiner, Julier, and MacIntyre. Recent advances in augmented reality. In IEEE Computer Graphics and Applications, Vol. 21, No. 6, pages 34–47, Nov. 2001.

[3] G. Bazzoni, E. Bianchi, O. Grau, A. Knox, R. Koch, F. Lavagetto, A. Parkinson, F. Pedersini, A. Sarti, G. Thomas, and S. Tubaro. The ORIGAMI Project – advanced tools and techniques for high-end mixing and interaction between real and virtual content. In IEEE Proceedings of 1st International Symposium on 3D Data Processing Visualization and Transmission (3DPVT'02), 2002.

[4] A. J. Davison. Real-time simultaneous localisation and mapping with a single camera. In Proceedings International Conference on Computer Vision, Nice, 2003.

[5] A. J. Davison, Y. G. Cid, and N. Kita. Real-time 3D SLAM with wide-angle vision. In Proc. IFAC Symposium on Intelligent Autonomous Vehicles, Lisbon, July 2004.

[6] A. J. Davison, W. W. Mayol, and D. W. Murray. Real-time localisation and mapping with wearable active vision. In Proceedings of the IEEE International Symposium on Mixed and Augmented Reality, Tokyo, 2003.

[7] R. Hartley and A. Zisserman. Multiple View Geometry in Computer Vision. Cambridge University Press, 2000.

[8] M. Pollefeys, R. Koch, and L. J. V. Gool. Self-calibration and metric reconstruction in spite of varying and unknown internal camera parameters. International Journal of Computer Vision, 32(1):7–25, 1999.

[9] J. Shi and C. Tomasi. Good features to track. In Conference on Computer Vision and Pattern Recognition, pages 593–600, Seattle, June 1994. IEEE.
