
The RANSAC Algorithm

Random sample consensus (RANSAC) is an iterative method to estimate parameters of a mathematical model from a set of observed data that contains outliers, when outliers are to be accorded no influence on the values of the estimates. Therefore, it also can be interpreted as an outlier detection method. It is a non-deterministic algorithm in the sense that it produces a reasonable result only with a certain probability, with this probability increasing as more iterations are allowed. The algorithm was first published by Fischler and Bolles at SRI International in 1981. They used RANSAC to solve the Location Determination Problem (LDP), where the goal is to determine the points in the space that project onto an image into a set of landmarks with known locations.

A basic assumption is that the data consists of "inliers", i.e., data whose distribution can be explained by some set of model parameters, though may be subject to noise, and "outliers", which are data that do not fit the model. The outliers can come, for example, from extreme values of the noise or from erroneous measurements or incorrect hypotheses about the interpretation of data. RANSAC also assumes that, given a (usually small) set of inliers, there exists a procedure which can estimate the parameters of a model that optimally explains or fits this data.


Example

A simple example is fitting a line in two dimensions to a set of observations. Assuming that this set contains both inliers, i.e., points which approximately can be fitted to a line, and outliers, points which cannot be fitted to this line, a simple least squares method for line fitting will generally produce a line with a bad fit to the data including inliers and outliers. The reason is that it is optimally fitted to all points, including the outliers. RANSAC, on the other hand, attempts to exclude the outliers and find a linear model that only uses the inliers in its calculation. This is done by fitting linear models to several random samplings of the data and returning the model that has the best fit to a subset of the data. Since the inliers tend to be more linearly related than a random mixture of inliers and outliers, a random subset that consists entirely of inliers will have the best model fit. In practice, there is no guarantee that a subset of inliers will be randomly sampled, and the probability of the algorithm succeeding depends on the proportion of inliers in the data as well as the choice of several algorithm parameters.

Figure: A data set with many outliers for which a line has to be fitted.

Figure: Fitted line with RANSAC; outliers have no influence on the result.
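The effect described above can be made concrete numerically. The following Python sketch (an illustration added here, not part of the original article; the specific data values are arbitrary) fits a line by ordinary least squares to ten exact inliers on y = 2x + 1 plus one gross outlier, and shows how far the estimated slope drifts:

```python
import numpy as np

# Ten inliers exactly on the line y = 2x + 1
x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0

# One gross outlier far above the line
x_all = np.append(x, 5.0)
y_all = np.append(y, 200.0)

# Ordinary least squares is pulled toward the outlier...
slope_ls, intercept_ls = np.polyfit(x_all, y_all, 1)

# ...while a fit on the inliers alone recovers the true parameters.
slope_in, intercept_in = np.polyfit(x, y, 1)

print(slope_in, intercept_in)  # essentially 2.0 and 1.0
print(slope_ls, intercept_ls)  # slope noticeably biased by the single outlier
```

A single outlier shifts the least-squares slope from 2 to roughly 3; this is the bias that RANSAC's inlier selection is designed to avoid.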

Overview

The RANSAC algorithm is a learning technique to estimate parameters of a model by random sampling of observed data. Given a dataset whose data elements contain both inliers and outliers, RANSAC uses a voting scheme to find the optimal fitting result. Data elements in the dataset are used to vote for one or multiple models. The implementation of this voting scheme is based on two assumptions: that the noisy features will not vote consistently for any single model (few outliers) and there are enough features to agree on a good model (few missing data). The RANSAC algorithm is essentially composed of two steps that are iteratively repeated:

In the first step, a sample subset containing minimal data items is randomly selected from the input dataset. A fitting model and the corresponding model parameters are computed using only the elements of this sample subset. The cardinality of the sample subset is the smallest sufficient to determine the model parameters.

In the second step, the algorithm checks which elements of the entire dataset are consistent with the model instantiated by the estimated model parameters obtained from the first step. A data element will be considered as an outlier if it does not fit the fitting model instantiated by the set of estimated model parameters within some error threshold that defines the maximum deviation attributable to the effect of noise.

The set of inliers obtained for the fitting model is called the consensus set. The RANSAC algorithm will iteratively repeat the above two steps until the obtained consensus set in a certain iteration has enough inliers.

The input to the RANSAC algorithm is a set of observed data values, a way of fitting some kind of model to the observations, and some confidence parameters. RANSAC achieves its goal by repeating the following steps:

1. Select a random subset of the original data. Call this subset the hypothetical inliers.
2. A model is fitted to the set of hypothetical inliers.
3. All other data are then tested against the fitted model. Those points that fit the estimated model well, according to some model-specific loss function, are considered as part of the consensus set.
4. The estimated model is reasonably good if sufficiently many points have been classified as part of the consensus set.
5. Afterwards, the model may be improved by reestimating it using all members of the consensus set.

This procedure is repeated a fixed number of times, each time producing either a model which is rejected because too few points are part of the consensus set, or a refined model together with a corresponding consensus set size. In the latter case, we keep the refined model if its consensus set is larger than that of the previously saved model.

Figure: RANSAC: inliers and outliers.

Algorithm

The generic RANSAC algorithm works as follows:

Given:
    data – a set of observations
    model – a model to explain observed data points
    n – minimum number of data points required to estimate model parameters
    k – maximum number of iterations allowed in the algorithm
    t – threshold value to determine data points that are fit well by model
    d – number of close data points required to assert that a model fits well to data

Return:
    bestFit – model parameters which best fit the data (or null if no good model is found)

iterations = 0
bestFit = null
bestErr = something really large
while iterations < k {
    maybeInliers = n randomly selected values from data
    maybeModel = model parameters fitted to maybeInliers
    alsoInliers = empty set
    for every point in data not in maybeInliers {
        if point fits maybeModel with an error smaller than t
            add point to alsoInliers
    }
    if the number of elements in alsoInliers is > d {
        % this implies that we may have found a good model
        % now test how good it is
        betterModel = model parameters fitted to all points in maybeInliers and alsoInliers
        thisErr = a measure of how well betterModel fits these points
        if thisErr < bestErr {
            bestFit = betterModel
            bestErr = thisErr
        }
    }
    increment iterations
}
return bestFit
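The pseudocode above translates almost line for line into Python. In the sketch below the parameter names n, k, t, d follow the list used in this article; the line model (y = a·x + b), the perpendicular-distance error measure, and the degeneracy guard are minimal choices of my own for illustrating 2D line fitting, not part of the original algorithm description:

```python
import math
import random

def fit_line(points):
    """Least-squares line y = a*x + b through the given points."""
    m = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points); sxy = sum(p[0] * p[1] for p in points)
    denom = m * sxx - sx * sx
    a = (m * sxy - sx * sy) / denom
    b = (sy - a * sx) / m
    return a, b

def point_error(model, p):
    """Perpendicular distance from point p to the line y = a*x + b."""
    a, b = model
    return abs(a * p[0] - p[1] + b) / math.sqrt(a * a + 1.0)

def ransac(data, n, k, t, d):
    best_fit, best_err = None, float("inf")
    for _ in range(k):
        maybe_inliers = random.sample(data, n)
        if len({p[0] for p in maybe_inliers}) < 2:
            continue  # degenerate sample: vertical line not representable as y = a*x + b
        maybe_model = fit_line(maybe_inliers)
        also_inliers = [p for p in data if p not in maybe_inliers
                        and point_error(maybe_model, p) < t]
        if len(also_inliers) > d:
            # we may have found a good model; test how good it is
            consensus = maybe_inliers + also_inliers
            better_model = fit_line(consensus)
            this_err = sum(point_error(better_model, p) for p in consensus)
            if this_err < best_err:
                best_fit, best_err = better_model, this_err
    return best_fit
```

For example, on 20 exact inliers of y = 2x + 1 mixed with a few gross outliers, `ransac(data, n=2, k=100, t=0.5, d=10)` recovers the slope 2 and intercept 1.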

Matlab implementation

A Matlab implementation of 2D line fitting using the RANSAC algorithm:

function [bestParameter1, bestParameter2] = ransac_demo(data, num, iter, threshDist, inlierRatio)
% data: a 2xn dataset with #n data points
% num: the minimum number of points; for the line fitting problem, num = 2
% iter: the number of iterations
% threshDist: the threshold of the distances between points and the fitting line
% inlierRatio: the threshold of the number of inliers

%% Plot the data points
figure; plot(data(1,:), data(2,:), 'o'); hold on;
number = size(data, 2);  % Total number of points
bestInNum = 0;           % Best fitting line with largest number of inliers
bestParameter1 = 0; bestParameter2 = 0;  % parameters for best fitting line
for i = 1:iter
    %% Randomly select 2 points
    idx = randperm(number, num); sample = data(:, idx);
    %% Compute the distances between all points and the fitting line
    kLine = sample(:,2) - sample(:,1);  % two points' relative distance
    kLineNorm = kLine / norm(kLine);
    normVector = [-kLineNorm(2), kLineNorm(1)];  % Ax+By+C=0: A = -kLineNorm(2), B = kLineNorm(1)
    distance = normVector * (data - repmat(sample(:,1), 1, number));
    %% Compute the inliers with distances smaller than the threshold
    inlierIdx = find(abs(distance) <= threshDist);
    inlierNum = length(inlierIdx);
    %% Update the number of inliers and the fitting model if a better model is found
    if inlierNum >= round(inlierRatio * number) && inlierNum > bestInNum
        bestInNum = inlierNum;
        parameter1 = (sample(2,2) - sample(2,1)) / (sample(1,2) - sample(1,1));
        parameter2 = sample(2,1) - parameter1 * sample(1,1);
        bestParameter1 = parameter1; bestParameter2 = parameter2;
    end
end
%% Plot the best fitting line
xAxis = -number/2:number/2;
yAxis = bestParameter1 * xAxis + bestParameter2;
plot(xAxis, yAxis, 'r-', 'LineWidth', 2);
end

%% Generate random data for a test
data = 150 * (2 * rand(2,100) - 1); data = data .* rand(2,100);
ransac_demo(data, 2, 100, 10, 0.1);

Parameters

The threshold value t to determine when a data point fits a model, and the number of close data points d required to assert that a model fits well to the data, are determined based on specific requirements of the application and the data set, and possibly based on experimental evaluation. The number of iterations k, however, can be determined as a function of the desired probability of success, as derived below. Let p be the desired probability that the RANSAC algorithm returns a successful result, i.e., that in some iteration it selects only inliers from the input data set when it chooses the n points from which the model parameters are estimated. Let w be the probability of choosing an inlier each time a single point is selected, that is,

w = (number of inliers in data) / (number of points in data)

A common case is that w is not well known beforehand, but some rough value can be given. Assuming that the n points needed for estimating a model are selected independently, w^n is the probability that all n points are inliers and 1 − w^n is the probability that at least one of the n points is an outlier, a case which implies that a bad model will be estimated from this point set. That probability to the power of k is the probability that the algorithm never selects a set of n points which all are inliers, and this must be the same as 1 − p. Consequently,

1 − p = (1 − w^n)^k

which, after taking the logarithm of both sides, leads to

k = log(1 − p) / log(1 − w^n)
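Plugging numbers into the formula above is straightforward. The sketch below (the particular values p = 0.99, w = 0.5, n = 2 are illustrative choices, not from the article) computes the required number of iterations for a line model with half the data being inliers and a 99% desired success probability:

```python
import math

def ransac_iterations(p, w, n):
    """Number of iterations k so that with probability p at least one
    sample of n points consists only of inliers (inlier ratio w)."""
    return math.ceil(math.log(1.0 - p) / math.log(1.0 - w ** n))

print(ransac_iterations(p=0.99, w=0.5, n=2))  # 17
```

With only half the data usable, 17 random pairs already suffice for 99% confidence; note how quickly k grows as w drops or n rises.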

This result assumes that the n data points are selected independently, that is, a point which has been selected once is replaced and can be selected again in the same iteration. This is often not a reasonable approach, and the derived value for k should be taken as an upper limit in the case that the points are selected without replacement. For example, in the case of finding a line which fits the data set illustrated in the above figure, the RANSAC algorithm typically chooses two points in each iteration and computes maybe_model as the line between the points, and it is then critical that the two points are distinct.

To gain additional confidence, the standard deviation or multiples thereof can be added to k. The standard deviation of k is defined as

SD(k) = sqrt(1 − w^n) / w^n
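The standard deviation formula can be evaluated the same way. A small sketch (w = 0.5 and n = 2 are again illustrative values, not prescribed by the article):

```python
import math

def ransac_iterations_sd(w, n):
    """Standard deviation of k: SD(k) = sqrt(1 - w^n) / w^n."""
    wn = w ** n
    return math.sqrt(1.0 - wn) / wn

print(ransac_iterations_sd(0.5, 2))  # about 3.46
```

Adding one or two such standard deviations to the k computed earlier gives the extra margin of confidence the text refers to.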

Advantages and disadvantages


An advantage of RANSAC is its ability to do robust estimation of the model parameters, i.e., it can estimate the parameters with a high degree of accuracy even when a significant number of outliers are present in the data set. A disadvantage of RANSAC is that there is no upper bound on the time it takes to compute these parameters (except exhaustion). When the number of iterations computed is limited, the solution obtained may not be optimal, and it may not even be one that fits the data in a good way. In this way RANSAC offers a trade-off; by computing a greater number of iterations, the probability of a reasonable model being produced is increased. Moreover, RANSAC is not always able to find the optimal set even for moderately contaminated sets, and it usually performs badly when the number of inliers is less than 50%. Optimal RANSAC was proposed to handle both these problems and is capable of finding the optimal set for heavily contaminated sets, even for an inlier ratio under 5%. Another disadvantage of RANSAC is that it requires the setting of problem-specific thresholds.

RANSAC can only estimate one model for a particular data set. As for any one-model approach, when two (or more) model instances exist, RANSAC may fail to find either one. The Hough transform is one alternative robust estimation technique that may be useful when more than one model instance is present. Another approach for multi-model fitting is known as PEARL, which combines model sampling from data points as in RANSAC with iterative re-estimation of inliers, the multi-model fitting being formulated as an optimization problem with a global energy functional describing the quality of the overall solution.

Applications

The RANSAC algorithm is often used in computer vision, e.g., to simultaneously solve the correspondence problem and estimate the fundamental matrix related to a pair of stereo cameras.

Development and improvements

Since 1981 RANSAC has become a fundamental tool in the computer vision and image processing community. In 2006, for the 25th anniversary of the algorithm, a workshop was organized at the International Conference on Computer Vision and Pattern Recognition (CVPR) to summarize the most recent contributions and variations to the original algorithm, mostly meant to improve the speed of the algorithm, the robustness and accuracy of the estimated solution, and to decrease the dependency on user-defined constants.

RANSAC can be sensitive to the choice of the correct noise threshold that defines which data points fit a model instantiated with a certain set of parameters. If such threshold is too large, then all the hypotheses tend to be ranked equally (good). On the other hand, when the noise threshold is too small, the estimated parameters tend to be unstable (i.e. by simply adding or removing a datum to the set of inliers, the estimate of the parameters may fluctuate). To partially compensate for this undesirable effect, Torr et al. proposed two modifications of RANSAC called MSAC (M-estimator SAmple and Consensus) and MLESAC (Maximum Likelihood Estimation SAmple and Consensus). The main idea is to evaluate the quality of the consensus set (i.e. the data that fit a model and a certain set of parameters) by calculating its likelihood (whereas in the original formulation by Fischler and Bolles the rank was the cardinality of such set). An extension to MLESAC which takes into account the prior probabilities associated to the input dataset is proposed by Tordoff. The resulting algorithm is dubbed Guided-MLESAC. Along similar lines, Chum proposed to guide the sampling procedure if some a priori information regarding the input data is known, i.e. whether a datum is likely to be an inlier or an outlier. The proposed approach is called PROSAC, PROgressive SAmple Consensus.
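The qualitative difference between the plain RANSAC score and the MSAC/MLESAC idea can be sketched with a toy example. Plain RANSAC counts inliers; MSAC instead penalizes each datum by a bounded loss, so inliers that fit tightly score better than inliers that barely fit. This is a simplified illustration of the scoring idea only, with made-up residual values, not the published MSAC/MLESAC algorithms:

```python
def ransac_score(residuals, t):
    # Plain RANSAC: every point within the threshold counts the same.
    return sum(1 for r in residuals if abs(r) < t)

def msac_score(residuals, t):
    # MSAC-style bounded loss (lower is better): inliers contribute their
    # squared residual, outliers a constant penalty t^2.
    return sum(min(r * r, t * t) for r in residuals)

res_tight = [0.1, 0.2, 0.1, 5.0]  # three tight inliers, one outlier
res_loose = [0.9, 0.9, 0.9, 5.0]  # three barely-fitting inliers, one outlier

# Both models have three inliers at t = 1.0, so plain RANSAC cannot
# distinguish them, while the MSAC-style score prefers the tighter fit.
print(ransac_score(res_tight, 1.0), ransac_score(res_loose, 1.0))  # 3 3
print(msac_score(res_tight, 1.0) < msac_score(res_loose, 1.0))     # True
```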

Chum et al. also proposed a randomized version of RANSAC called R-RANSAC to reduce the computational burden of identifying a good consensus set. The basic idea is to initially evaluate the goodness of the currently instantiated model using only a reduced set of points instead of the entire dataset. A sound strategy will tell with high confidence when it is appropriate to evaluate the fitting of the entire dataset or when the model can be readily discarded. It is reasonable to think that the impact of this approach is more relevant in cases where the percentage of inliers is large. The type of strategy proposed by Chum et al. is called a preemption scheme. Nistér proposed a paradigm called Preemptive RANSAC that allows real-time robust estimation of the structure of a scene and of the motion of the camera. The core idea of the approach consists in generating a fixed number of hypotheses so that the comparison happens with respect to the quality of the generated hypotheses rather than against some absolute quality metric.

Other researchers tried to cope with difficult situations where the noise scale is not known and/or multiple model instances are present. The first problem has been tackled in the work by Wang and Suter. Toldo et al. represent each datum with the characteristic function of the set of random models that fit the point. Then multiple models are revealed as clusters which group the points supporting the same model. The clustering algorithm, called J-linkage, does not require prior specification of the number of models, nor does it necessitate manual parameter tuning.

RANSAC has also been tailored for recursive state estimation applications, where the input measurements are corrupted by outliers and Kalman filter approaches, which rely on a Gaussian distribution of the measurement error, are doomed to fail. Such an approach is dubbed KALMANSAC.

Related methods

MLESAC (Maximum Likelihood Estimate SAmple Consensus) – maximizes the likelihood that the data was generated from the sample-fitted model, e.g. a mixture model of inliers and outliers

MAPSAC (Maximum A Posterior SAmple Consensus) – extends MLESAC to incorporate a prior probability of the parameters to be fitted and maximizes the posterior probability

KALMANSAC – causal inference of the state of a dynamical system

Notes

Strutz, T., Data Fitting and Uncertainty, Springer Vieweg (2nd edition, 2016).

Huber, Peter J., Robust Statistics, Wiley, 1981 (republished in paperback, 2004), page 1.

Anders Hast, Johan Nysjö, Andrea Marchetti (2013). "Optimal RANSAC – Towards a Repeatable Algorithm for Finding the Optimal Set". Journal of WSCG 21 (1): 21–30.

Hossam Isack, Yuri Boykov (2012). "Energy-based Geometric Multi-Model Fitting". International Journal of Computer Vision 97 (2): 123–147. doi:10.1007/s11263-011-0474-7.

P.H.S. Torr and A. Zisserman, MLESAC: A new robust estimator with application to estimating image geometry, Journal of Computer Vision and Image Understanding 78 (2000), no. 1, 138–156.

B.J. Tordoff and D.W. Murray, Guided-MLESAC: Faster image transform estimation by using matching priors, IEEE Transactions on Pattern Analysis and Machine Intelligence 27 (2005), no. 10, 1523–1535.

O. Chum and J. Matas, Matching with PROSAC – progressive sample consensus, Proceedings of the Conference on Computer Vision and Pattern Recognition (San Diego), vol. 1, June 2005, pp. 220–226.

O. Chum and J. Matas, Randomized RANSAC with Td,d test, 13th British Machine Vision Conference, September 2002.

D. Nistér, Preemptive RANSAC for live structure and motion estimation, IEEE International Conference on Computer Vision (Nice, France), October 2003, pp. 199–206.

H. Wang and D. Suter, Robust adaptive-scale parametric model estimation for computer vision, IEEE Transactions on Pattern Analysis and Machine Intelligence 26 (2004), no. 11, 1459–1474.

R. Toldo and A. Fusiello, Robust multiple structures estimation with J-linkage, European Conference on Computer Vision (Marseille, France), October 2008, pp. 537–547.

A. Vedaldi, H. Jin, P. Favaro, and S. Soatto, KALMANSAC: Robust filtering by consensus, Proceedings of the International Conference on Computer Vision (ICCV), vol. 1, 2005, pp. 633–640.

References

Martin A. Fischler & Robert C. Bolles (June 1981). "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography". Communications of the ACM 24 (6): 381–395.

David A. Forsyth & Jean Ponce (2003). Computer Vision: A Modern Approach. Prentice Hall.

Richard Hartley and Andrew Zisserman (2003). Multiple View Geometry in Computer Vision (2nd ed.). Cambridge University Press.

Strutz, T. (2016). Data Fitting and Uncertainty (A practical introduction to weighted least squares and beyond). 2nd edition, Springer Vieweg.

P.H.S. Torr & D.W. Murray (1997). "The Development and Comparison of Robust Methods for Estimating the Fundamental Matrix". International Journal of Computer Vision 24 (3): 271–300.

Ondrej Chum (2005). Two-View Geometry Estimation by Random Sample and Consensus. PhD thesis.

Sunglok Choi; Taemin Kim & Wonpil Yu (2009). "Performance Evaluation of RANSAC Family". In Proceedings of the British Machine Vision Conference (BMVC).

Anders Hast; Johan Nysjö; Andrea Marchetti (2013). "Optimal RANSAC – Towards a Repeatable Algorithm for Finding the Optimal Set". Journal of WSCG 21 (1): 21–30.

Hossam Isack; Yuri Boykov (2012). "Energy-based Geometric Multi-Model Fitting". International Journal of Computer Vision 97 (2): 123–147.
