arXiv:0801.0311v1 [q-bio.NC] 2 Jan 2008

Computational Neurobiology Laboratory,
The Salk Institute for Biological Studies, La Jolla, CA 92037
sharpee@salk.edu
Abstract
This paper compares a family of methods for characterizing neural feature selectivity with natural stimuli in the framework of the linear-nonlinear model. In this model, the neural firing rate is a nonlinear function of a small number of relevant stimulus components. The relevant stimulus dimensions can be found by maximizing one of the family of objective functions, Rényi divergences of different orders [1, 2]. We show that maximizing one of them, the Rényi divergence of order 2, is equivalent to least-square fitting of the linear-nonlinear model to neural data. Next, we derive reconstruction errors in relevant dimensions found by maximizing Rényi divergences of arbitrary order in the asymptotic limit of large spike numbers. We find that the smallest errors are obtained with the Rényi divergence of order 1, also known as Kullback-Leibler divergence. This corresponds to finding relevant dimensions by maximizing mutual information [2]. We numerically test how these optimization schemes perform in the regime of low signal-to-noise ratio (small number of spikes and increasing neural noise) for model visual neurons. We find that optimization schemes based on either least square fitting or information maximization perform well even when the number of spikes is small. Information maximization provides slightly, but significantly, better reconstructions than least square fitting. This makes the problem of finding relevant dimensions, together with the problem of lossy compression [3], one of the examples where information-theoretic measures are no more data limited than those derived from least squares.
1 Introduction
The application of system identification techniques to the study of sensory neural systems has a long history. One family of approaches employs the dimensionality reduction idea: while inputs are typically very high-dimensional, not all dimensions are equally important for eliciting a neural response [4, 5, 6, 7, 8]. The aim is then to find a small set of dimensions {ê₁, ê₂, ...} in the stimulus space that are relevant for the neural response, without imposing, however, a particular functional dependence between the neural response and the stimulus components {s₁, s₂, ...} along the relevant dimensions:

\[
P(\mathrm{spike}|\mathbf{s}) = P(\mathrm{spike})\, g(s_1, s_2, \ldots, s_K). \tag{1}
\]

If the inputs are Gaussian, the last requirement is not important, because relevant dimensions can be found without knowing a correct functional form for the nonlinear function g in Eq. (1). However, for non-Gaussian inputs a wrong assumption for the form of the nonlinearity g will lead to systematic errors in the estimate of the relevant dimensions themselves [9, 5, 1, 2]. The larger the deviations of the stimulus distribution from a Gaussian, the larger will be the effect of errors in the presumed form of the nonlinearity g on estimating the relevant dimensions. Because inputs derived from a natural environment, either visual or auditory, have been shown to be strongly non-Gaussian [10], we will concentrate here on system identification methods suitable for either Gaussian or non-Gaussian stimuli.
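To make the structure of Eq. (1) concrete, here is a minimal Python sketch of a linear-nonlinear model with a single relevant dimension. The stimulus ensemble, the random filter `e1`, the sigmoidal gain `g`, and the overall rate scaling are illustrative assumptions for the sketch, not quantities taken from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 100          # stimulus dimensionality (illustrative)
N = 50_000       # number of stimulus frames (illustrative)

# Stimuli: one frame per row; any non-Gaussian ensemble could be used here.
stimuli = rng.laplace(size=(N, D))

# One relevant dimension (unit norm), chosen at random for this sketch.
e1 = rng.normal(size=D)
e1 /= np.linalg.norm(e1)

def g(x):
    """Illustrative nonlinear gain: spike probability rises sigmoidally
    with the projection of the stimulus onto the relevant dimension."""
    return 1.0 / (1.0 + np.exp(-(x - 1.0)))

# Eq. (1): P(spike|s) = P(spike) * g(s . e1); here P(spike) is absorbed into
# an overall scale, and Bernoulli spikes are drawn frame by frame.
projections = stimuli @ e1
p_spike = 0.1 * g(projections)
spikes = rng.random(N) < p_spike        # boolean spike train, one bin per frame
```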
To find the relevant dimensions for neural responses probed with non-Gaussian inputs, Hunter and Korenberg proposed an iterative scheme [5] where the relevant dimensions are first found by assuming that the input–output function g is linear. Its functional form is then updated given the current estimate of the relevant dimensions. The inverse of g is then used to improve the estimate of the relevant dimensions. This procedure can be improved not to rely on inverting the nonlinear function g by formulating the optimization problem exclusively with respect to the relevant dimensions [1, 2], where the nonlinear function g is taken into account in the objective function to be optimized. A family of objective functions suitable for finding relevant dimensions with natural stimuli has been proposed based on Rényi divergences [1] between the probability distributions of stimulus components along the candidate relevant dimensions computed with respect to all inputs and those associated with spikes. Here we show that the optimization problem based on the Rényi divergence of order 2 corresponds to least square fitting of the linear-nonlinear model to neural spike trains. The Kullback-Leibler divergence also belongs to this family and is the Rényi divergence of order 1. It quantifies the amount of mutual information between the neural response and the stimulus components along the relevant dimension [2]. The optimization scheme based on information maximization has been previously proposed and implemented on model [2] and real cells [11]. Here we derive asymptotic errors for optimization strategies based on Rényi divergences of arbitrary order, and show that relevant dimensions found by maximizing the Kullback-Leibler divergence have the smallest errors in the limit of large spike numbers, compared to maximizing other Rényi divergences, including the one which implements least squares. We then show in numerical simulations on model cells that this trend persists even for very low spike numbers.
2 Variance as an Objective Function
One way of selecting a low-dimensional model of the neural response is to minimize a χ²-difference between the spike probabilities measured and predicted by the model, after averaging across all inputs s:

\[
\chi^2[\mathbf{v}] = \int d\mathbf{s}\, P(\mathbf{s}) \left[ \frac{P(\mathrm{spike}|\mathbf{s})}{P(\mathrm{spike})} - g(\mathbf{s}\cdot\mathbf{v}) \right]^2, \tag{2}
\]

where dimension v is the relevant dimension for a given model described by Eq. (1) [multiple dimensions could also be used, see below]. Using Bayes' rule, P(spike|s)/P(spike) = P(s|spike)/P(s) and, along the candidate dimension v, g(x) = P_v(x|spike)/P_v(x) with x = s·v; rearranging terms, we get:

\[
\chi^2[\mathbf{v}] = \int d\mathbf{s}\, P(\mathbf{s}) \left[ \frac{P(\mathbf{s}|\mathrm{spike})}{P(\mathbf{s})} - \frac{P_{\mathbf{v}}(\mathbf{s}\cdot\mathbf{v}|\mathrm{spike})}{P_{\mathbf{v}}(\mathbf{s}\cdot\mathbf{v})} \right]^2
= \int d\mathbf{s}\, \frac{\left[P(\mathbf{s}|\mathrm{spike})\right]^2}{P(\mathbf{s})} - \int dx\, \frac{\left[P_{\mathbf{v}}(x|\mathrm{spike})\right]^2}{P_{\mathbf{v}}(x)}. \tag{3}
\]

In the last integral, averaging has been carried out with respect to all stimulus components except for those along the trial direction v, so that the integration variable is x = s·v. The probability distributions P_v(x) and P_v(x|spike) represent the result of this averaging across all presented stimuli and across those that lead to a spike, respectively:

\[
P_{\mathbf{v}}(x) = \int d\mathbf{s}\, P(\mathbf{s})\, \delta(x - \mathbf{s}\cdot\mathbf{v}), \qquad
P_{\mathbf{v}}(x|\mathrm{spike}) = \int d\mathbf{s}\, P(\mathbf{s}|\mathrm{spike})\, \delta(x - \mathbf{s}\cdot\mathbf{v}), \tag{4}
\]

where δ(x) is a delta-function. In practice, both of the averages (4) are calculated by binning the range of projection values x and computing histograms normalized to unity. Note that if multiple spikes are sometimes elicited, the probability distribution P_v(x|spike) can be constructed by weighting the contribution from each stimulus according to the number of spikes it elicited.
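The binning procedure just described can be written compactly. The Python sketch below (bin count and names are illustrative choices, not taken from the paper) estimates P_v(x) and P_v(x|spike) from a stimulus matrix and a vector of spike counts, weighting each stimulus by the number of spikes it elicited.

```python
import numpy as np

def projection_histograms(stimuli, spike_counts, v, n_bins=25):
    """Estimate P_v(x) and P_v(x|spike) of Eq. (4) by binning projections x = s.v.

    stimuli      : (N, D) array, one stimulus frame per row
    spike_counts : (N,) array of spike counts elicited by each frame
    v            : (D,) candidate dimension (normalized inside)
    Returns bin centers and the two histograms, each normalized to unit sum.
    """
    v = v / np.linalg.norm(v)
    x = stimuli @ v

    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])

    # P_v(x): histogram of all projections.
    p_all, _ = np.histogram(x, bins=edges)
    p_all = p_all / p_all.sum()

    # P_v(x|spike): same histogram, each frame weighted by its spike count.
    p_spike, _ = np.histogram(x, bins=edges, weights=spike_counts)
    p_spike = p_spike / p_spike.sum()

    return centers, p_all, p_spike
```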
If neural spikes are indeed based on one relevant dimension, then this dimension will explain all of the variance, leading to χ² = 0. For all other dimensions v, χ²[v] > 0. Based on Eq. (3), in order to minimize χ² we need to maximize

\[
F[\mathbf{v}] = \int dx\, P_{\mathbf{v}}(x) \left[ \frac{P_{\mathbf{v}}(x|\mathrm{spike})}{P_{\mathbf{v}}(x)} \right]^2. \tag{5}
\]
Eq. (5) is the member of order α = 2 of a family of objective functions based on Rényi divergences between the probability distributions P_v(x|spike) and P_v(x); even more general objective functions can be obtained by using an arbitrary convex function of the ratio of the two probability distributions (instead of a power α in a Rényi divergence of order α) [12, 13, 1]. For an optimization strategy based on Rényi divergences of order α, the relevant dimensions are found by maximizing:

\[
F^{(\alpha)}[\mathbf{v}] = \frac{1}{\alpha-1} \int dx\, P_{\mathbf{v}}(x) \left[ \frac{P_{\mathbf{v}}(x|\mathrm{spike})}{P_{\mathbf{v}}(x)} \right]^{\alpha}. \tag{6}
\]

By comparison, when the relevant dimension(s) are found by maximizing information [2], the goal is to maximize the Kullback-Leibler divergence, which can be obtained by taking a formal limit α → 1:

\[
I[\mathbf{v}] = \int dx\, P_{\mathbf{v}}(x)\, \frac{P_{\mathbf{v}}(x|\mathrm{spike})}{P_{\mathbf{v}}(x)} \ln \frac{P_{\mathbf{v}}(x|\mathrm{spike})}{P_{\mathbf{v}}(x)}
= \int dx\, P_{\mathbf{v}}(x|\mathrm{spike}) \ln \frac{P_{\mathbf{v}}(x|\mathrm{spike})}{P_{\mathbf{v}}(x)}. \tag{7}
\]

The maximal value of F[v], attained when v coincides with the true relevant dimension, is given by the first, v-independent term of Eq. (3):

\[
F_{\max} = \int d\mathbf{s}\, \frac{\left[P(\mathbf{s}|\mathrm{spike})\right]^2}{P(\mathbf{s})}. \tag{8}
\]
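As a check on Eqs. (5)–(7), the sketch below evaluates the order-α objective, and its α → 1 Kullback-Leibler limit, from the binned estimates of P_v(x) and P_v(x|spike) produced by the hypothetical `projection_histograms` helper of the earlier sketch; it is meant only to illustrate the formulas.

```python
import numpy as np

def renyi_objective(p_all, p_spike, alpha=2.0, eps=1e-12):
    """F^(alpha)[v] of Eq. (6), computed from normalized histograms.

    p_all   : estimate of P_v(x)        (sums to 1)
    p_spike : estimate of P_v(x|spike)  (sums to 1)
    For alpha -> 1 this returns the Kullback-Leibler divergence of Eq. (7),
    i.e. the information objective (in nats).
    """
    mask = (p_all > eps) & (p_spike > eps)
    ratio = p_spike[mask] / p_all[mask]

    if np.isclose(alpha, 1.0):
        # Eq. (7): I[v] = sum_x P_v(x|spike) ln[P_v(x|spike)/P_v(x)]
        return float(np.sum(p_spike[mask] * np.log(ratio)))

    # Eq. (6): F^(alpha)[v] = 1/(alpha-1) * sum_x P_v(x) * ratio**alpha
    return float(np.sum(p_all[mask] * ratio ** alpha) / (alpha - 1.0))
```

With `alpha=2` the returned value coincides with the variance objective of Eq. (5); values of α near 1 approach the information objective of Eq. (7).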
F_max corresponds to the variance in the firing rate averaged across different inputs [see Eq. (9) below]. Computation of the mutual information carried by an individual spike about the stimulus relies on similar integrals. Following the procedure outlined for computing mutual information [14], one can use Bayes' rule and the ergodic assumption to compute F_max as a time-average:

\[
F_{\max} = \frac{\overline{r^2(t)}}{\bar{r}^{\,2}}, \tag{9}
\]

where the firing rate r(t) = P(spike|s)/∆t is measured in time bins of width ∆t using multiple repetitions of the same stimulus sequence, and the overbar denotes a time average. The stimulus ensemble should be diverse enough to justify the ergodic assumption [this could be checked by computing F_max for increasing fractions of the overall dataset size]. The average firing rate r̄ = P(spike)/∆t is obtained by averaging r(t) in time.
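Eq. (9) translates directly into a one-line estimate from a trial-averaged firing rate. The sketch below assumes a hypothetical array `psth` of rates r(t), one entry per time bin, obtained from repeated presentations of the same stimulus sequence.

```python
import numpy as np

def f_max_from_rate(psth):
    """Eq. (9): F_max = <r(t)^2>_t / rbar^2, with rbar the time-averaged rate.

    psth : (T,) array of trial-averaged firing rates r(t), one entry per bin.
    """
    rbar = psth.mean()
    return np.mean(psth ** 2) / rbar ** 2

# Example with a made-up rate vector (spikes/s in each bin):
# f_max_from_rate(np.array([2.0, 0.0, 8.0, 1.0, 4.0]))
```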
As with information maximization [2], F^(α)[v] can be maximized by gradient ascent directly on the data, without assuming a particular functional form for the nonlinearity g. The gradient with respect to the candidate dimension v is

\[
\nabla_{\mathbf{v}} F^{(\alpha)} = \alpha \int dx\, P_{\mathbf{v}}(x|\mathrm{spike}) \Big[ \langle \mathbf{s}|x,\mathrm{spike}\rangle - \langle \mathbf{s}|x\rangle \Big] \left[\frac{P_{\mathbf{v}}(x|\mathrm{spike})}{P_{\mathbf{v}}(x)}\right]^{\alpha-2} \frac{d}{dx}\!\left[\frac{P_{\mathbf{v}}(x|\mathrm{spike})}{P_{\mathbf{v}}(x)}\right], \tag{10}
\]

where ⟨s|x⟩ and ⟨s|x, spike⟩ denote the average stimulus with projection value x, computed across all presented stimuli and across those that elicited a spike, respectively. The case of multiple relevant dimensions is treated analogously, with respect to projections on all the relevant dimensions when forming probability distributions (4). For example, in the case of two dimensions v₁ and v₂, we would use

\[
P_{\mathbf{v}_1,\mathbf{v}_2}(x_1,x_2|\mathrm{spike}) = \int d\mathbf{s}\, \delta(x_1-\mathbf{s}\cdot\mathbf{v}_1)\, \delta(x_2-\mathbf{s}\cdot\mathbf{v}_2)\, P(\mathbf{s}|\mathrm{spike}), \qquad
P_{\mathbf{v}_1,\mathbf{v}_2}(x_1,x_2) = \int d\mathbf{s}\, \delta(x_1-\mathbf{s}\cdot\mathbf{v}_1)\, \delta(x_2-\mathbf{s}\cdot\mathbf{v}_2)\, P(\mathbf{s}), \tag{11}
\]

and compute the variance with respect to the two dimensions as F[v₁, v₂] = ∫ dx₁ dx₂ [P_{v₁,v₂}(x₁,x₂|spike)]² / P_{v₁,v₂}(x₁,x₂).

If multiple stimulus dimensions are relevant for eliciting the neural response, they can always be found (provided a sufficient number of responses have been recorded) by optimizing the variance according to Eq. (11) with the correct number of dimensions. In practice this involves finding a single relevant dimension first, and then iteratively increasing the number of relevant dimensions considered while adjusting the previously found relevant dimensions. The amount by which relevant dimensions need to be adjusted is proportional to the contribution of subsequent relevant dimensions to neural spiking (the corresponding expression has the same functional form as that for relevant dimensions found by maximizing information, cf. Appendix B of [2]). If stimuli are either uncorrelated or correlated but Gaussian, then the previously found dimensions do not need to be adjusted when additional dimensions are introduced. All of the relevant dimensions can be found one by one, by always searching only for a single relevant dimension in the subspace orthogonal to the relevant dimensions already found.

3 Illustration for a model simple cell

Here we illustrate how relevant dimensions can be found by maximizing variance (equivalent to least square fitting), and compare this scheme with that of finding relevant dimensions by maximizing information, as well as with those that are based upon computing the spike-triggered average. Our goal is to reconstruct relevant dimensions of neurons probed with inputs of arbitrary statistics. We used stimuli derived from a natural visual environment [11] that are known to strongly deviate from a Gaussian distribution. All of the studies have been carried out with respect to model neurons. The advantage of doing so is that the relevant dimensions are known. The example model neuron is taken to mimic properties of simple cells found in the primary visual cortex. It has a single relevant dimension, which we will denote as ê₁. As can be seen in Fig. 1(a), it is phase and orientation sensitive. In this model, a given stimulus s leads to a spike if the projection s₁ = s·ê₁ reaches a threshold value θ in the presence of noise: P(spike|s)/P(spike) ≡ g(s₁) = ⟨H(s₁ − θ + ξ)⟩_ξ, where the average is over the Gaussian random variable ξ with variance σ², which models additive noise, and the function H(x) = 1 for x > 0 and zero otherwise. The parameters θ for the threshold and the noise variance σ² determine the input–output function. In what follows we will measure these parameters in units of the standard deviation of stimulus projections along the relevant dimension. In these units, the noise standard deviation σ sets the signal-to-noise ratio.
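The threshold-crossing model just described is straightforward to simulate. The sketch below draws spikes from P(spike|s)/P(spike) = ⟨H(s₁ − θ + ξ)⟩ with Gaussian ξ, using an illustrative random filter and Gaussian white-noise frames in place of the Gabor-like ê₁ and natural-scene patches used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_threshold_cell(stimuli, e1, theta=2.0, sigma=0.5):
    """Spike generation for the model simple cell of Sec. 3: a frame elicits a
    spike when its projection s1 = s.e1 crosses the threshold theta in the
    presence of additive Gaussian noise xi of standard deviation sigma.
    theta and sigma are expressed in units of the standard deviation of s1."""
    e1 = e1 / np.linalg.norm(e1)
    s1 = stimuli @ e1
    s1 = s1 / s1.std()                          # units of the std of s1
    xi = rng.normal(scale=sigma, size=s1.shape)
    return s1 - theta + xi > 0                  # H(s1 - theta + xi) = 1

# Illustrative use with white-noise frames and a random unit-norm filter:
D, N = 900, 50_000
stimuli = rng.normal(size=(N, D))
e1 = rng.normal(size=D)
spikes = simulate_threshold_cell(stimuli, e1)   # boolean, one entry per frame
```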
Figure 1 shows that it is possible to obtain a good estimate of the relevant dimension ê₁ by maximizing either information, as shown in panel (b), or variance, as shown in panel (c). The final value of the projection depends on the size of the dataset, as will be discussed below. In the example shown in Fig. 1 there were ≈ 50,000 spikes with an average spike probability of ≈ 0.05 per frame, and the reconstructed vector has a projection v̂_max·ê₁ = 0.98 when maximizing either information or variance. Having estimated the relevant dimension, one can proceed to sample the nonlinear input–output function. This is done by constructing histograms P(s·v̂_max) and P(s·v̂_max|spike) of projections onto the vector v̂_max found by maximizing either information or variance, and taking their ratio. Because of Bayes' rule, this yields the nonlinear input–output function g of Eq. (1). In Fig. 1(d) the spike probability of the reconstructed neuron, P(spike|s·v̂_max) (crosses), is compared with the probability P(spike|s₁) used in the model (solid line). A good match is obtained.

In actuality, reconstructing even just one relevant dimension from neural responses to correlated non-Gaussian inputs, such as those derived from the real world, is not an easy problem. This fact can be appreciated by considering the estimates of the relevant dimension obtained from the spike-triggered average (STA), shown in panel (e). Correcting the STA for second-order correlations of the input ensemble, through multiplication by the inverse covariance matrix, results in a very noisy estimate, shown in panel (f). It has a projection value of 0.25. An attempt to regularize the inverse of the covariance matrix results in a closer match to the true relevant dimension [15, 16, 17, 18, 19] and has a projection value of 0.8, as shown in panel (g). While it appears to be less noisy, the regularized decorrelated STA can have systematic deviations from the true relevant dimensions [9, 20, 2, 11]. The preferred orientation is less susceptible to distortions than the preferred spatial frequency [19]. In this case regularization was performed by setting aside 1/4 of the data as a test dataset, and choosing a cutoff on the eigenvalues of the input covariance matrix that would give the maximal information value on the test dataset [16, 19].

Figure 1: Analysis of a model visual neuron with one relevant dimension, shown in (a). Panels (b) and (c) show the normalized vectors v̂_max found by maximizing information and variance, respectively. (d) The probability of a spike P(spike|s·v̂_max) (blue crosses: information maximization; red crosses: variance maximization) is compared to P(spike|s₁) used in generating spikes (solid line). Parameters of the model are σ = 0.5 and θ = 2, both given in units of the standard deviation of s₁, which is also the unit for the x-axis in panels (d) and (h). The spike-triggered average (STA) is shown in (e). An attempt to remove correlations according to the reverse correlation method, C_{a priori}^{-1} v_{STA} (decorrelated STA), is shown in panel (f), and in panel (g) with regularization (see text). In panel (h), the spike probabilities as functions of stimulus projections onto the dimensions obtained as the decorrelated STA (blue crosses) and the regularized decorrelated STA (red crosses) are compared to the spike probability used to generate spikes (solid line).
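For comparison with panels (e)–(g), here is a minimal sketch of the spike-triggered average, its decorrelated version C⁻¹·v_STA, and a regularized variant that keeps only the leading eigenvalues of the stimulus covariance matrix. The cutoff-selection step (maximizing information on a held-out quarter of the data, as described in the text) is omitted, and the variable names are illustrative.

```python
import numpy as np

def sta_estimates(stimuli, spikes, n_keep=100):
    """Spike-triggered average and its (regularized) decorrelated versions.

    stimuli : (N, D) array of stimulus frames
    spikes  : (N,) spike counts (or booleans) per frame
    n_keep  : number of leading covariance eigenvalues kept in the
              regularized inverse; in the text the cutoff is instead chosen
              by maximizing information on held-out data.
    Returns unit-normalized (sta, decorrelated, regularized) vectors.
    """
    w = np.asarray(spikes, dtype=float)
    sta = (stimuli * w[:, None]).sum(axis=0) / w.sum()

    # Second-order correction: multiply by the inverse stimulus covariance.
    # Small eigenvalues make this very noisy, which is why regularization helps.
    C = np.cov(stimuli, rowvar=False)
    eigval, eigvec = np.linalg.eigh(C)              # ascending eigenvalues
    dsta = eigvec @ ((eigvec.T @ sta) / eigval)

    # Regularized inverse: use only the n_keep largest eigenvalues.
    keep = np.argsort(eigval)[::-1][:n_keep]
    rdsta = eigvec[:, keep] @ ((eigvec[:, keep].T @ sta) / eigval[keep])

    unit = lambda v: v / np.linalg.norm(v)
    return unit(sta), unit(dsta), unit(rdsta)
```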
4 Comparison of Performance with Finite Data

In the limit of infinite data the relevant dimensions can be found by maximizing variance, information, or other objective functions [1]. In a real experiment, with a dataset of finite size, the optimal vector v̂ found by any of the Rényi divergences will deviate from the true relevant dimension ê₁. In this section we compare the robustness of optimization strategies based on Rényi divergences of various orders, including least squares fitting (α = 2) and information maximization (α = 1), as the dataset size decreases and/or neural noise increases.

The deviation from the true relevant dimension, δv = v̂ − ê₁, arises because the probability distributions (4) are estimated from experimental histograms and differ from the distributions found in the limit of infinite data size. The effects of noise on the reconstruction can be characterized by taking the dot product between the relevant dimension and the optimal vector for a particular data sample:

\[
\hat{\mathbf{v}}\cdot\hat{\mathbf{e}}_1 = 1 - \tfrac{1}{2}\langle \delta\mathbf{v}^2 \rangle. \tag{12}
\]

In the asymptotic regime of a large number of spikes N_spike, the covariance of the deviations is ⟨δv_i δv_j⟩ = (H⁻¹ B H⁻¹)_ij / N_spike, where B is the covariance matrix of the gradient of the merit function per spike and H is its Hessian. The Hessian of the Rényi divergence of arbitrary order, when evaluated along the optimal dimension ê₁, and the corresponding gradient covariance are given by

\[
H^{(\alpha)}_{ij} = -\alpha \int dx\, P(x|\mathrm{spike})\, C_{ij}(x) \left[ \frac{d}{dx}\frac{P(x|\mathrm{spike})}{P(x)} \right]^2 \left[ \frac{P(x|\mathrm{spike})}{P(x)} \right]^{\alpha-3}, \qquad
B^{(\alpha)}_{ij} = \alpha^2 \int dx\, P(x|\mathrm{spike})\, C_{ij}(x) \left[ \frac{d}{dx}\frac{P(x|\mathrm{spike})}{P(x)} \right]^2 \left[ \frac{P(x|\mathrm{spike})}{P(x)} \right]^{2\alpha-4}, \tag{13}
\]

where C_ij(x) is the covariance matrix of stimuli with a given projection x onto the optimal dimension. Therefore the expected error in the reconstruction of the optimal filter is inversely proportional to the number of spikes:

\[
\hat{\mathbf{v}}\cdot\hat{\mathbf{e}}_1 \approx 1 - \frac{\mathrm{Tr}'\!\left[B H^{-2}\right]}{2 N_{\mathrm{spike}}}, \tag{14}
\]

where we omitted superscripts (α) for clarity. Tr′ denotes the trace taken in the subspace orthogonal to the relevant dimension (deviations along the relevant dimension have no meaning [2], which mathematically manifests itself in the dimension ê₁ being an eigenvector of the matrices H and B with zero eigenvalue). Note that when α = 1, which corresponds to the Kullback-Leibler divergence and information maximization, A ≡ B_{α=1} = −H_{α=1}. The asymptotic errors in this case are completely determined by the trace of the Hessian of information, ⟨δv²⟩ ∝ Tr′[A⁻¹], reproducing the previously published result for maximally informative dimensions [2]. Qualitatively, the expected error ∼ D/(2N_spike) increases in proportion to the dimensionality D of the inputs and decreases as more spikes are collected. This dependence is in common with the expected errors of relevant dimensions found by maximizing information [2], as well as with methods based on computing the spike-triggered average both for white noise [1, 21, 22] and correlated Gaussian inputs [2].

Next we examine which of the Rényi divergences provides the smallest asymptotic error (14) for estimating relevant dimensions. Representing the covariance matrix as C_ij(x) = γ_ik(x)γ_jk(x) (the exact expression for the matrices γ will not be needed), we can express the Hessian matrix H and the covariance matrix of the gradient B as averages with respect to the probability distribution P(x|spike):

\[
B = \int dx\, P(x|\mathrm{spike})\, \mathbf{b}(x)\mathbf{b}^{T}(x), \qquad
H = -\int dx\, P(x|\mathrm{spike})\, \mathbf{a}(x)\mathbf{b}^{T}(x), \tag{15}
\]

where the gain function g(x) = P(x|spike)/P(x), and the matrices b_ij(x) = α γ_ij(x) g′(x) [g(x)]^{α−2} and a_ij(x) = γ_ij(x) g′(x)/g(x). The Cauchy-Schwarz identity for scalar quantities states that ⟨b²⟩/⟨ab⟩² ≥ 1/⟨a²⟩, where the averages are taken with respect to some probability distribution. A similar result can also be proven for matrices under a trace operation, as in Eq. (14). Applying the matrix version of the Cauchy-Schwarz identity to Eq. (14), we find that the smallest error is obtained when

\[
\mathrm{Tr}'\!\left[B H^{-2}\right] = \mathrm{Tr}'\!\left[A^{-1}\right], \qquad \text{with} \quad
A = \int dx\, P(x|\mathrm{spike})\, \mathbf{a}(x)\mathbf{a}^{T}(x). \tag{16}
\]

The matrix A corresponds to (minus) the Hessian of the merit function for α = 1: A = −H^{(α=1)}. Thus, among the various optimization strategies based on Rényi divergences, the Kullback-Leibler divergence (α = 1) has the smallest asymptotic errors. Least square fitting corresponds to optimization based on the Rényi divergence with α = 2, and is expected to have larger errors than optimization based on the Kullback-Leibler divergence (α = 1) implementing information maximization. This result agrees with recent findings that the Kullback-Leibler divergence is the best distortion measure for performing lossy compression [3].
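To connect the asymptotic analysis above with the simulations that follow, here is a rough, self-contained sketch of how a relevant dimension can be recovered in practice by directly maximizing the histogram-based objective over unit vectors. It uses plain stochastic hill climbing started from the spike-triggered average rather than the gradient expression (10), so it is only an illustration of how the α = 1 and α = 2 strategies can be compared; all names and settings are assumptions of the sketch.

```python
import numpy as np

def falpha(stimuli, weights, v, alpha=1.0, n_bins=25, eps=1e-12):
    """Histogram estimate of F^(alpha)[v], Eq. (6); alpha -> 1 gives I[v], Eq. (7)."""
    x = stimuli @ (v / np.linalg.norm(v))
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    p_all, _ = np.histogram(x, bins=edges)
    p_spk, _ = np.histogram(x, bins=edges, weights=weights)
    p_all = p_all / p_all.sum()
    p_spk = p_spk / p_spk.sum()
    m = (p_all > eps) & (p_spk > eps)
    ratio = p_spk[m] / p_all[m]
    if np.isclose(alpha, 1.0):
        return float(np.sum(p_spk[m] * np.log(ratio)))
    return float(np.sum(p_all[m] * ratio ** alpha) / (alpha - 1.0))

def find_dimension(stimuli, spike_counts, alpha=1.0, n_iter=2000, step=0.05, seed=0):
    """Stochastic hill climbing over unit vectors v, started from the STA.
    alpha=1 corresponds to information maximization, alpha=2 to variance."""
    rng = np.random.default_rng(seed)
    w = np.asarray(spike_counts, dtype=float)
    v = (stimuli * w[:, None]).sum(axis=0)        # spike-triggered average as a start
    v = v / np.linalg.norm(v)
    best = falpha(stimuli, w, v, alpha)
    for _ in range(n_iter):
        trial = v + step * rng.normal(size=v.shape)
        trial = trial / np.linalg.norm(trial)
        val = falpha(stimuli, w, trial, alpha)
        if val > best:                            # keep only improving moves
            v, best = trial, val
    return v, best
```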
Below we use numerical simulations with model cells to compare the performance of the information (α = 1) and variance (α = 2) maximization strategies in the regime of relatively small numbers of spikes.

Figure 2: Projection of the vector v̂_max obtained by maximizing information (red filled symbols) or variance (blue open symbols) on the true relevant dimension ê₁, plotted as a function of the ratio between the stimulus dimensionality D and the number of spikes N_spike, with D = 900. Simulations were carried out for model visual neurons with the one relevant dimension from Fig. 1(a) and the input–output function of Eq. (1) described by threshold θ = 2.0 and noise standard deviation σ = 1.5, 1.0, 0.5, 0.25 for groups labeled A (△), B (▽), C, and D, respectively. The left panel also shows results obtained using the spike-triggered average (STA, gray) and decorrelated STA (dSTA, black). In the right panel, we replot the results for information and variance optimization together with those for the regularized decorrelated STA (RdSTA, green open symbols). All error bars show standard deviations.

5 Conclusions

In this paper we compared the accuracy of a family of optimization strategies, based on Rényi divergences, for analyzing neural responses to natural stimuli. Finding relevant dimensions by maximizing one of the merit functions, the Rényi divergence of order 2, corresponds to fitting the linear-nonlinear model in the least-square sense to neural spike trains. The advantage of this approach over the standard least square fitting procedure is that it does not require the nonlinear gain function to be invertible. We derived the errors expected for relevant dimensions computed by maximizing Rényi divergences of arbitrary order in the asymptotic regime of large spike numbers. The smallest errors were achieved not in the case of (nonlinear) least square fitting of the linear-nonlinear model to the neural spike trains (Rényi divergence of order 2), but with information maximization (based on the Kullback-Leibler divergence). Numerical simulations of the performance of both information and variance maximization strategies showed that both algorithms performed well even when the number of spikes is very small. With small numbers of spikes, reconstructions based on information maximization also had slightly, but significantly, smaller errors than those of least-square fitting. This makes the problem of finding relevant dimensions, together with the problem of lossy compression [23, 3], one of the examples where information-theoretic measures are no more data limited than those derived from least squares. It remains possible, however, that other merit functions based on non-polynomial divergence measures could provide even smaller reconstruction errors than information maximization.

References

[1] L. Paninski. Convergence properties of three spike-triggered analysis techniques. Network: Comput. Neural Syst., 14:437–464, 2003.

[2] T. Sharpee, N. C. Rust, and W. Bialek. Analyzing neural responses to natural signals: Maximally informative dimensions. Neural Computation, 16:223–250, 2004. See also physics/0212110, and a preliminary account in Advances in Neural Information Processing 15, edited by S. Becker, S. Thrun, and K. Obermayer, pp. 261–268 (MIT Press, Cambridge, 2003).

[3] Peter Harremoës and Naftali Tishby. The information bottleneck revisited or how to choose a good distortion measure. Proc. of the IEEE Int. Symp. on Information Theory (ISIT), 2007.

[4] E. de Boer and P. Kuyper. Triggered correlation. IEEE Trans. Biomed. Eng., 15:169–179, 1968.
[5] I. W. Hunter and M. J. Korenberg. The identification of nonlinear biological systems: Wiener and Hammerstein cascade models. Biol. Cybern., 55:135–144, 1986.

[6] R. R. de Ruyter van Steveninck and W. Bialek. Real-time performance of a movement-sensitive neuron in the blowfly visual system: coding and information transfer in short spike sequences. Proc. R. Soc. Lond. B, 265:259–265, 1988.

[7] V. Z. Marmarelis. Modeling methodology for nonlinear physiological systems. Ann. Biomed. Eng., 25:239–251, 1997.

[8] W. Bialek and R. R. de Ruyter van Steveninck. Features and dimensions: Motion estimation in fly vision. q-bio/0505003, 2005.

[9] D. L. Ringach, G. Sapiro, and R. Shapley. A subspace reverse-correlation technique for the study of visual neurons. Vision Res., 37:2455–2464, 1997.

[10] D. L. Ruderman and W. Bialek. Statistics of natural images: scaling in the woods. Phys. Rev. Lett., 73:814–817, 1994.

[11] T. O. Sharpee, H. Sugihara, A. V. Kurgansky, S. P. Rebrik, M. P. Stryker, and K. D. Miller. Adaptive filtering enhances information transmission in visual cortex. Nature, 439:936–942, 2006.

[12] S. M. Ali and S. D. Silvey. A general class of coefficients of divergence of one distribution from another. J. R. Statist. Soc. B, 28:131–142, 1966.

[13] I. Csiszár. Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar., 2:299–318, 1967.

[14] N. Brenner, S. P. Strong, R. Koberle, W. Bialek, and R. R. de Ruyter van Steveninck. Synergy in a neural code. Neural Computation, 12:1531–1552, 2000. See also physics/9902067.

[15] F. E. Theunissen, K. Sen, and A. J. Doupe. Spectral-temporal receptive fields of nonlinear auditory neurons obtained using natural sounds. J. Neurosci., 20:2315–2331, 2000.

[16] F. E. Theunissen, S. V. David, N. C. Singh, A. Hsu, W. E. Vinje, and J. L. Gallant. Estimating spatio-temporal receptive fields of auditory and visual neurons from their responses to natural stimuli. Network, 12:289–316, 2001.

[17] K. Sen, F. E. Theunissen, and A. J. Doupe. Feature analysis of natural sounds in the songbird auditory forebrain. J. Neurophysiol., 86:1445–1458, 2001.

[18] D. Smyth, B. Willmore, G. E. Baker, I. D. Thompson, and D. J. Tolhurst. The receptive-field organization of simple cells in the primary visual cortex of ferrets under natural scene stimulation. J. Neurosci., 23:4746–4759, 2003.

[19] G. Felsen, J. Touryan, F. Han, and Y. Dan. Cortical sensitivity to visual features in natural scenes. PLoS Biol., 3:1819–1828, 2005.

[20] D. L. Ringach, M. J. Hawken, and R. Shapley. Receptive field structure of neurons in monkey visual cortex revealed by stimulation with natural image sequences. Journal of Vision, 2:12–24, 2002.

[21] N. C. Rust, O. Schwartz, J. A. Movshon, and E. P. Simoncelli. Spatiotemporal elements of macaque V1 receptive fields. Neuron, 46:945–956, 2005.

[22] O. Schwartz, J. W. Pillow, N. C. Rust, and E. P. Simoncelli. Spike-triggered neural characterization. Journal of Vision, 6:484–507, 2006.

[23] N. Tishby, F. C. Pereira, and W. Bialek. The information bottleneck method. In B. Hajek and R. S. Sreenivas, editors, Proceedings of the 37th Allerton Conference on Communication, Control and Computing, pp. 368–377. University of Illinois, 1999. See also physics/0004057.