Hierarchical Temporal Memory
including
HTM Cortical Learning Algorithms

VERSION 0.1.1, NOVEMBER 23, 2010
© Numenta, Inc. 2010

Use of Numenta's software and intellectual property, including the ideas contained in this document, is free for non-commercial research purposes. For details, see
[Link]
Numenta2010 Page2
Read This First!

This is a draft version of this document. There are several things missing that you should be aware of.

What IS in this document:

This document describes in detail new algorithms for learning and prediction developed by Numenta. The new algorithms are described in sufficient detail that a programmer can understand and implement them if desired. It starts with an introductory chapter. If you have been following Numenta and have read some of our past white papers, the material in the introductory chapter will be familiar. The other material is new.

What is NOT in this document:

There are several topics related to the implementation of these new algorithms that did not make it into this early draft.

Although most aspects of the algorithms have been implemented and tested in software, none of the test results are currently included.

There is no description of how the algorithms can be applied to practical problems. Missing is a description of how you would convert data from a sensor or database into a distributed representation suitable for the algorithms.

The algorithms are capable of on-line learning, but a few details needed to fully implement on-line learning in some rarer cases are not described.

Other planned additions include chapters on the biological basis of the algorithms, a discussion of the properties of sparse distributed representations, and chapters on applications and examples.

We are making this document available in its current form because we think the algorithms will be of interest to others. The missing components of the document should not impede understanding and experimenting with the algorithms by motivated researchers. We will revise this document regularly to reflect our progress.
Table of Contents

Preface 4
Chapter 1: HTM Overview 7
Chapter 2: HTM Cortical Learning Algorithms 19
Chapter 3: Spatial Pooling Implementation and Pseudocode 34
Chapter 4: Temporal Pooling Implementation and Pseudocode 39
Glossary 47
Appendix A: A Comparison between Biological Neurons and HTM Cells 51

Future chapters will include information on applications, examples, comparison to other models of machine learning, a bibliography, a chapter on the biological basis of HTM, and sparse distributed representations.
Preface

There are many things humans find easy to do that computers are currently unable to do. Tasks such as visual pattern recognition, understanding spoken language, recognizing and manipulating objects by touch, and navigating in a complex world are easy for humans. Yet despite decades of research, we have few viable algorithms for achieving human-like performance on a computer.

In humans, these capabilities are largely performed by the neocortex. Hierarchical Temporal Memory (HTM) is a technology modeled on how the neocortex performs these functions. HTM offers the promise of building machines that approach or exceed human level performance for many cognitive tasks.

This document describes HTM technology. Chapter 1 provides a broad overview of HTM, outlining the importance of hierarchical organization, sparse distributed representations, and learning time-based transitions. Chapter 2 describes the HTM cortical learning algorithms in detail. Chapters 3 and 4 provide pseudocode for the HTM learning algorithms divided in two parts called the spatial pooler and temporal pooler. After reading chapters 2 through 4, experienced software engineers should be able to reproduce and experiment with the algorithms. Hopefully, some readers will go further and extend our work.

Intended audience

This document is intended for a technically educated audience. While we don't assume prior knowledge of neuroscience, we do assume you can understand neuroscience concepts when they are explained, and that you are comfortable with basic mathematical and software concepts. You might be an engineer, a scientist, a student in computer science or cognitive science, or a software developer who is interested in building artificial cognitive systems that work on the same principles as the human brain.

Non-technical readers can still benefit from certain parts of the document, particularly Chapter 1: HTM Overview.

Software release

It is our intention to release software based on the algorithms described in this document in mid-2011.
Relation to previous documents

Parts of HTM theory are described in the 2004 book On Intelligence, in white papers published by Numenta, and in peer-reviewed papers written by Numenta employees. We don't assume you have read any of this prior material, much of which has been incorporated and updated in this volume. Note that the HTM learning algorithms described in Chapters 2 through 4 have not been previously published. The new algorithms replace our first generation algorithms, called "Zeta 1". For a short time, we called the new algorithms "Fixed-density Distributed Representations", or "FDR", but we are no longer using this terminology. We call the new algorithms the HTM Cortical Learning Algorithms, or sometimes just the HTM Learning Algorithms.

We encourage you to read On Intelligence, written by Numenta co-founder Jeff Hawkins with Sandra Blakeslee. Although the book does not mention HTM by name, it provides an easy-to-read, non-technical explanation of HTM theory and the neuroscience behind it. At the time On Intelligence was written, we understood the basic principles underlying HTM but we didn't know how to implement those principles algorithmically. You can think of this document as continuing the work started in On Intelligence.

About Numenta

Numenta, Inc. ([Link]) was formed in 2005 to develop HTM technology for both commercial and scientific use. To achieve this goal we are fully documenting our progress and discoveries. We also publish our software in a form that other people can use for both research and commercial development. We have structured our software to encourage the emergence of an independent, application developer community. Use of Numenta's software and intellectual property is free for research purposes. We will generate revenue by selling support, licensing software, and licensing intellectual property for commercial deployments. We always will seek to make our developer partners successful, as well as be successful ourselves.

Numenta is based in Redwood City, California.

About the authors

This document is a collaborative effort by the employees of Numenta. The names of the principal authors for each section are listed in the revision history.
Revision history

The revision history records the major changes in each version of this document. Minor changes such as small clarifications or formatting changes are not noted.
Chapter 1: HTM Overview

Hierarchical Temporal Memory (HTM) is a machine learning technology that aims to capture the structural and algorithmic properties of the neocortex.

The neocortex is the seat of intelligent thought in the mammalian brain. High level vision, hearing, touch, movement, language, and planning are all performed by the neocortex. Given such a diverse suite of cognitive functions, you might expect the neocortex to implement an equally diverse suite of specialized neural algorithms. This is not the case. The neocortex displays a remarkably uniform pattern of neural circuitry. The biological evidence suggests that the neocortex implements a common set of algorithms to perform many different intelligence functions.

HTM provides a theoretical framework for understanding the neocortex and its many capabilities. To date we have implemented a small subset of this theoretical framework. Over time, more and more of the theory will be implemented. Today we believe we have implemented a sufficient subset of what the neocortex does to be of commercial and scientific value.

Programming HTMs is unlike programming traditional computers. With today's computers, programmers create specific programs to solve specific problems. By contrast, HTMs are trained through exposure to a stream of sensory data. The HTM's capabilities are determined largely by what it has been exposed to.

HTMs can be viewed as a type of neural network. By definition, any system that tries to model the architectural details of the neocortex is a neural network. However, on its own, the term "neural network" is not very useful because it has been applied to a large variety of systems. HTMs model neurons (called cells when referring to HTM), which are arranged in columns, in layers, in regions, and in a hierarchy. The details matter, and in this regard HTMs are a new form of neural network.

As the name implies, HTMs are fundamentally memory based systems. HTM networks are trained on lots of time-varying data, and rely on storing a large set of patterns and sequences. The way data is stored and accessed is logically different from the standard model used by programmers today. Classic computer memory has a flat organization and does not have an inherent notion of time. A programmer can implement any kind of data organization and structure on top of the flat computer memory, and has control over how and where information is stored. By contrast, HTM memory is more restrictive. HTM memory has a hierarchical organization and is inherently time based. Information is always stored in a distributed fashion. A user of an HTM specifies the size of the hierarchy and what to train the system on, but the HTM controls where and how information is stored.
Although HTM networks are substantially different than classic computing, we can use general purpose computers to model them as long as we incorporate the key functions of hierarchy, time and sparse distributed representations (described in detail later). We believe that over time, specialized hardware will be created to generate purpose-built HTM networks.

In this document, we often illustrate HTM properties and principles using examples drawn from human vision, touch, hearing, language, and behavior. Such examples are useful because they are intuitive and easily grasped. However, it is important to keep in mind that HTM capabilities are general. They can just as easily be exposed to non-human sensory input streams, such as radar and infrared, or to purely informational input streams such as financial market data, weather data, Web traffic patterns, or text. HTMs are learning and prediction machines that can be applied to many types of problems.

HTM principles

In this section, we cover some of the core principles of HTM: why hierarchical organization is important, how HTM regions are structured, why data is stored as sparse distributed representations, and why time-based information is critical.

Hierarchy

An HTM network consists of regions arranged in a hierarchy. The region is the main unit of memory and prediction in an HTM, and will be discussed in detail in the next section. Typically, each HTM region represents one level in the hierarchy. As you ascend the hierarchy there is always convergence, multiple elements in a child region converging onto an element in a parent region. However, due to feedback connections, information also diverges as you descend the hierarchy. (A "region" and a "level" are almost synonymous. We use the word "region" when describing the internal function of a region, whereas we use the word "level" when referring specifically to the role of the region within the hierarchy.)
Figure 1.1: Simplified diagram of four HTM regions arranged in a four-level hierarchy,
communicating information within levels, between levels, and to/from outside the hierarchy
It is possible to combine multiple HTM networks. This kind of structure makes sense if you have data from more than one source or sensor. For example, one network might be processing auditory information and another network might be processing visual information. There is convergence within each separate network, with the separate branches converging only towards the top.

The benefit of hierarchical organization is efficiency. It significantly reduces training time and memory usage because patterns learned at each level of the hierarchy are reused when combined in novel ways at higher levels. For an illustration, let's consider vision. At the lowest level of the hierarchy, your brain stores information about tiny sections of the visual field such as edges and corners. An edge is a fundamental component of many objects in the world. These low-level patterns are recombined at mid-levels into more complex components such as curves and textures. An arc can be the edge of an ear, the top of a steering wheel or the rim of a coffee cup. These mid-level patterns are further combined to represent high-level object features, such as heads, cars or houses. To learn a new high-level object you don't have to relearn its components.
As another example, consider that when you learn a new word, you don't need to relearn letters, syllables, or phonemes.

Sharing representations in a hierarchy also leads to generalization of expected behavior. When you see a new animal, if you see a mouth and teeth you will predict that the animal eats with its mouth and that it might bite you. The hierarchy enables a new object in the world to inherit the known properties of its subcomponents.

How much can a single level in an HTM hierarchy learn? Or put another way, how many levels in the hierarchy are necessary? There is a tradeoff between how much memory is allocated to each level and how many levels are needed. Fortunately, HTMs automatically learn the best possible representations at each level given the statistics of the input and the amount of resources allocated. If you allocate more memory to a level, that level will form representations that are larger and more complex, which in turn means fewer hierarchical levels may be necessary. If you allocate less memory, a level will form representations that are smaller and simpler, which in turn means more hierarchical levels may be needed.

Up to this point we have been describing difficult problems, such as vision inference ("inference" is similar to pattern recognition). But many valuable problems are simpler than vision and can be addressed with small hierarchies. As an example, we applied an HTM to predicting where a person browsing a website is likely to click next. This problem involved feeding the HTM streams of web click data. In this problem there was little or no spatial hierarchy; the solution mostly required discovering the temporal statistics, i.e. predicting where the user would click next by recognizing typical user patterns. The temporal learning algorithms in HTMs are ideal for such problems.

In summary, hierarchies reduce training time, reduce memory usage, and introduce a form of generalization. However, many simpler prediction problems can be solved with a single HTM region.
Regions

The notion of regions wired in a hierarchy comes from biology. The neocortex is a large sheet of neural tissue about 2mm thick. Biologists divide the neocortex into different areas or "regions" primarily based on how the regions connect to each other. Some regions receive input directly from the senses and other regions receive input only after it has passed through several other regions. It is the region-to-region connectivity that defines the hierarchy.

All neocortical regions look similar in their details. They vary in size and where they are in the hierarchy, but otherwise they are similar. If you take a slice across the 2mm thickness of a neocortical region, you will see six layers, five layers of cells and one non-cellular layer (there are a few exceptions but this is the general rule). Each layer in a neocortical region has many interconnected cells arranged in columns. HTM regions also are comprised of a sheet of highly interconnected cells arranged in columns. "Layer 3" in neocortex is one of the primary feed-forward layers of neurons. The cells in an HTM region are roughly equivalent to the neurons in layer 3 in a region of the neocortex.
Figure 1.3: A section of an HTM region. HTM regions are comprised of many cells. The cells
are organized in a two dimensional array of columns. This figure shows a small section of an
HTM region with four cells per column. Each column connects to a subset of the input and each
cell connects to other cells in the region (connections not shown). Note that this HTM region,
including its columnar structure, is equivalent to one layer of neurons in a neocortical region.
Although an HTM region is equivalent to only a portion of a neocortical region, it can do inference and prediction on complex data streams and therefore can be useful in many problems.

Sparse Distributed Representations

Although neurons in the neocortex are highly interconnected, inhibitory neurons guarantee that only a small percentage of the neurons are active at one time. Thus, information in the brain is always represented by a small percentage of active neurons within a large population of neurons. This kind of encoding is called a "sparse distributed representation". "Sparse" means that only a small percentage of neurons are active at one time. "Distributed" means that the activations of many neurons are required in order to represent something. A single active neuron conveys some meaning but it must be interpreted within the context of a population of neurons to convey the full meaning.

HTM regions also use sparse distributed representations. In fact, the memory mechanisms within an HTM region are dependent on using sparse distributed representations, and wouldn't work otherwise. The input to an HTM region is always a distributed representation, but it may not be sparse, so the first thing an HTM region does is to convert its input into a sparse distributed representation.
For example, a region might receive 20,000 input bits. The percentage of input bits that are "1" and "0" might vary significantly over time. One time there might be 5,000 "1" bits and another time there might be 9,000 "1" bits. The HTM region could convert this input into an internal representation of 10,000 bits of which 2%, or 200, are active at once, regardless of how many of the input bits are "1". As the input to the HTM region varies over time, the internal representation also will change, but there always will be about 200 bits out of 10,000 active.
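The conversion just described can be sketched in a few lines. The code below is an illustrative toy, not the actual spatial pooler described in Chapter 2: the sizes are scaled down from the text's example (1,000 output cells instead of 10,000), and the random-subset scoring scheme is an assumption made for brevity. What it demonstrates is the key property: inputs of very different densities all map to a representation with a fixed 2% of cells active.

```python
import random

def to_sparse(input_bits, num_cells=1000, active_cells=20, fan_in=64, seed=0):
    """Map a binary input of any density to a representation with a fixed
    number of active cells (here 20 of 1000, i.e. 2%). Each output cell
    scores the input through its own random subset of input positions,
    and only the top scorers stay active."""
    rng = random.Random(seed)
    n = len(input_bits)
    subsets = [rng.sample(range(n), fan_in) for _ in range(num_cells)]
    scores = [sum(input_bits[i] for i in sub) for sub in subsets]
    ranked = sorted(range(num_cells), key=scores.__getitem__, reverse=True)
    return set(ranked[:active_cells])

rng = random.Random(1)
dense  = [1 if rng.random() < 0.4 else 0 for _ in range(2000)]  # ~40% on
light  = [1 if rng.random() < 0.1 else 0 for _ in range(2000)]  # ~10% on
print(len(to_sparse(dense)), len(to_sparse(light)))  # 20 20
```

Whether 40% or 10% of the input bits are on, the output always has exactly 20 active cells, while different inputs still yield different sets of active cells.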
It may seem that this process generates a large loss of information, as the number of possible input patterns is much greater than the number of possible representations in the region. However, both numbers are incredibly big. The actual inputs seen by a region will be a miniscule fraction of all possible inputs; later we will describe how a region creates a sparse representation from its input. The theoretical loss of information will not have a practical effect.

Sparse distributed representations have several desirable properties and are integral to the operation of HTMs. They will be touched on again later.
The role of time

Time plays a crucial role in learning, inference, and prediction.

Let's start with inference. Without using time, we can infer almost nothing from our tactile and auditory senses. For example, if you are blindfolded and someone places an apple in your hand, you can identify what it is after manipulating it for just a second or so. As you move your fingers over the apple, although the tactile information is constantly changing, the object itself (the apple, as well as your high-level percept of "apple") stays constant. However, if an apple was placed on your outstretched palm, and you weren't allowed to move your hand or fingers, you would have great difficulty identifying it as an apple rather than a lemon.
The same is true for hearing. A static sound conveys little meaning. A word like "apple", or the crunching sounds of someone biting into an apple, can only be recognized from the dozens or hundreds of rapid, sequential changes over time of the sound spectrum.

Vision, in contrast, is a mixed case. Unlike touch and audition, humans are able to recognize images when they are flashed in front of them too fast to give the eyes a chance to move. Thus, visual inference does not always require time-changing inputs. However, during normal vision we constantly move our eyes, heads and bodies, and the images of the world constantly change. Inference with a quick visual exposure is a special case made possible by the statistical properties of vision. The general case for vision, hearing, and touch is that inference requires time-changing inputs.
Having covered the general case of inference, and the special case of vision inference of static images, let's turn to learning. In order to learn, all HTM systems must be exposed to time-changing inputs during training. Even in vision, where static inference is sometimes possible, we must see changing images of objects to learn what an object looks like. For example, imagine a dog is running toward you. At each instance in time the dog causes a pattern of activity on the retina in your eye. You perceive these patterns as different views of the same dog, but mathematically the patterns are entirely dissimilar. The brain learns that these dissimilar patterns mean the same thing by observing them in sequence. Time acts as a kind of supervisor, teaching you which spatial patterns go together.

Note that it is not sufficient for the inputs merely to change; a succession of unrelated patterns would only lead to confusion. The time-changing inputs must come from a common source in the world. Note also that although we use human senses as examples, the same requirement holds for non-human data. If we want to train an HTM to recognize patterns from a power plant's temperature, vibration and noise sensors, the HTM will need to be trained on data from those sensors changing through time.

Typically, an HTM network needs to be trained with lots of data. You learned to identify dogs by seeing many instances of many breeds of dogs, not just one single view of one single dog. The job of the HTM learning algorithms is to learn the temporal sequences from a stream of input data, i.e. to build a model of which patterns follow which other patterns. This job is difficult because it may not know when sequences start and end, there may be overlapping sequences occurring at the same time, learning has to occur continuously, and learning has to occur in the presence of noise.
Learning and recognizing sequences is the basis of forming predictions. Once an HTM learns what patterns are likely to follow other patterns, it can predict the likely next pattern(s) given the current input and recent past inputs. Prediction is covered in more detail later.
We now will turn to the four basic functions of HTM: learning, inference, prediction, and behavior. Every HTM region performs the first three: learning, inference, and prediction. Behavior, however, is different. We know from biology that most neocortical regions have a role in creating behavior but we do not believe it is essential for many interesting applications, and our current implementation of HTM does not include behavior. We mention it here for completeness.
Learning

An HTM region learns about its world by finding patterns and then sequences of patterns in sensory data. The region does not "know" what its inputs represent; it works in a purely statistical realm. It looks for combinations of input bits that occur together often, which we call spatial patterns. It then looks for how these spatial patterns appear in sequence over time, which we call temporal patterns or sequences.

If the input to the region represents environmental sensors on a building, the region might discover that certain combinations of temperature and humidity on the north side of the building occur often and that different combinations occur on the south side. Then it might learn that sequences of these combinations occur as each day passes.

If the input to a region represented information related to purchases within a store, the HTM region might discover that certain types of articles are purchased on weekends, or that when the weather is cold certain price ranges are favored in the evening. Then it might learn that different individuals follow similar sequential patterns in their purchases.

A single HTM region has limited learning capability. A region automatically adjusts what it learns based on how much memory it has and the complexity of the input it receives. The spatial patterns learned by a region will necessarily become simpler if the memory allocated to the region is reduced, and more complex if more memory is allocated. If the learned spatial patterns in a region are simple, then a hierarchy of regions may be needed to understand complex inputs. We see this characteristic in the human vision system, where the neocortical region receiving input from the retina learns spatial patterns for only small parts of the visual space. Only after several levels of hierarchy do spatial patterns combine and represent most or all of the visual space.
Like a biological system, the learning algorithms in an HTM region are capable of "on-line learning", i.e. they continually learn from each new input. There isn't a need for a learning phase separate from an inference phase, though inference improves after additional learning. As the patterns in the input change, the HTM region will gradually change, too.

After initial training, an HTM can continue to learn or, alternatively, learning can be disabled after the training phase. Another option is to turn off learning only at the lowest levels of the hierarchy but continue to learn at the higher levels. Once an HTM has learned the basic statistical structure of its world, most new learning occurs in the upper levels of the hierarchy. If an HTM is exposed to new patterns that have previously unseen low-level structure, it will take longer for the HTM to learn these new patterns. We see this trait in humans. Learning new words in a language you already know is relatively easy. However, if you try to learn new words from a foreign language with unfamiliar sounds, you'll find it much harder because you don't already know the low level sounds.

Simply discovering patterns is a potentially valuable capability. Understanding the high-level patterns in market fluctuations, disease, weather, manufacturing yield, or failures of complex systems, such as power grids, is valuable in itself. Even so, learning spatial and temporal patterns is mostly a precursor to inference and prediction.
Inference

After an HTM has learned the patterns in its world, it can perform inference on novel inputs. When an HTM receives input, it will match it to previously learned spatial and temporal patterns. Successfully matching new inputs to previously stored sequences is the essence of inference and pattern matching.

Think about how you recognize a melody. Hearing the first note tells you little; the second note narrows the possibilities but may still not be enough. Usually it takes three, four, or more notes before you recognize the melody. Inference in an HTM region is similar. It is constantly looking at a stream of inputs and matching them to previously learned sequences. An HTM region can find matches from the beginning of sequences but usually it is more fluid, analogous to how you can recognize a melody starting from anywhere. Because HTM regions use distributed representations, the region's use of sequence memory and inference are more complicated than the melody example implies, but the example gives a flavor for how it works.

It may not be immediately obvious, but every sensory experience you have ever had has been novel, yet you easily find familiar patterns in this novel input. For example, you can understand the word "breakfast" spoken by almost anyone, no matter whether they are old or young, male or female, are speaking quickly or slowly, or have a strong accent. Even if you had the same person say the same word "breakfast" a hundred times, the sound would never stimulate your cochleae (auditory receptors) in exactly the same way twice.

An HTM region faces the same problem your brain does: inputs may never repeat exactly. Consequently, just like your brain, an HTM region must handle novel input during inference and training. One way an HTM region copes with novel input is through the use of sparse distributed representations. A key property of sparse distributed representations is that you only need to match a portion of the pattern to be confident that the match is significant.
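This subsampling property is easy to demonstrate numerically. The sketch below is an illustrative calculation (not part of the algorithm), with numbers chosen to match the 200-of-10,000 sparse patterns used elsewhere in this document: a stored pattern still overlaps strongly with a heavily degraded version of itself, while its overlap with an unrelated sparse pattern is tiny by comparison.

```python
import random

rng = random.Random(0)
population = range(10000)

stored = set(rng.sample(population, 200))        # a learned sparse pattern
# The same pattern with a quarter of its bits replaced by random noise:
noisy = set(rng.sample(sorted(stored), 150)) | set(rng.sample(population, 50))
unrelated = set(rng.sample(population, 200))     # a random other pattern

print(len(stored & noisy))      # large: the match survives heavy noise
print(len(stored & unrelated))  # small: chance overlap is only a few bits
```

With 2% sparsity, two unrelated patterns share about 200 * 200 / 10000 = 4 bits on average, so even a partial match of several dozen bits is overwhelming evidence that the pattern is the stored one.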
Prediction

Every region of an HTM stores sequences of patterns. By matching stored sequences with current input, a region forms a prediction about what inputs will likely arrive next. HTM regions actually store transitions between sparse distributed representations. In some instances the transitions can look like a linear sequence, such as the notes in a melody, but in the general case many possible future inputs may be predicted at the same time. An HTM region will make different predictions based on context that might stretch back far in time. The majority of memory in an HTM is dedicated to sequence memory, or storing transitions between spatial patterns.
Following are some key properties of HTM prediction.

1) Prediction is continuous.

Without being conscious of it, you are constantly predicting. HTMs do the same. When listening to a song, you are predicting the next note. When walking down stairs, you are predicting when your foot will touch the next step. When watching a baseball pitcher throw, you are predicting that the ball will come near the batter. In an HTM region, prediction and inference are almost the same thing. Prediction is not a separate step but integral to the way an HTM region works.

2) Prediction occurs in every region at every level of the hierarchy.

If you have a hierarchy of HTM regions, prediction will occur at each level. Regions will make predictions about the patterns they have learned. In a language example, lower level regions might predict possible next phonemes, and higher level regions might predict words or phrases.

3) Predictions are context sensitive.

Predictions are based on what has occurred in the past, as well as what is occurring now. Thus an input will produce different predictions based on previous context. An HTM region learns to use as much prior context as needed, and can keep the context over both short and long stretches of time. For example, think about a memorized speech such as the Gettysburg Address. To predict the next word, knowing just the current word is rarely sufficient; the word "and" is followed by "seven" and later by "dedicated" just in the first sentence. Sometimes, just a little bit of context will help prediction; knowing "four score and" would help predict "seven". Other times, there are repetitive phrases, and one would need to use the context of a far longer timeframe to know where you are in the speech, and therefore what comes next.
4) Prediction leads to stability.

The output of a region is its prediction. One of the properties of HTMs is that the outputs of regions become more stable (that is, slower changing and longer lasting) the higher they are in the hierarchy. This property results from how a region predicts. A region doesn't just predict what will happen immediately next. If it can, it will predict multiple steps ahead in time. Let's say a region can predict five steps ahead. When a new input arrives, the newly predicted step changes but the four of the previously predicted steps might not. Consequently, even though each new input is completely different, only a part of the output is changing, making outputs more stable than inputs. This characteristic mirrors our experience of the real world, where high-level concepts, such as the name of a song, change more slowly than low-level concepts, the actual notes of the song.
5) A prediction tells us if a new input is expected or unexpected.

Each HTM region is a novelty detector. Because each region predicts what will occur next, it "knows" when something unexpected happens. HTMs can predict many possible next inputs simultaneously, not just one. So it may not be able to predict exactly what will happen next, but if the next input doesn't match any of the predictions the HTM region will know that an anomaly has occurred.

6) Prediction helps make the system more robust to noise.

When an HTM predicts what is likely to happen next, the prediction can bias the system toward inferring what it predicted. For example, if an HTM were processing spoken language, it would predict what sounds, words, and ideas are likely to be uttered next. This prediction helps the system fill in missing data. If an ambiguous sound arrives, the HTM will interpret the sound based on what it is expecting, thus helping inference even in the presence of noise.

In an HTM region, sequence memory, inference, and prediction are intimately integrated. They are the core functions of a region.
Behavior

Our behavior influences what we perceive. As we move our eyes, our retina receives changing sensory input. Moving our limbs and fingers causes varying touch sensations to reach the brain. Almost all our actions change what we sense. Sensory input and motor behavior are intimately entwined.

For decades the prevailing view was that a single region in the neocortex, the primary motor region, was where motor commands originated in the neocortex. Over time it was discovered that most or all regions in the neocortex have a motor output, even low-level sensory regions. It appears that all cortical regions integrate sensory and motor functions.

We expect that a motor output could be added to each HTM region within the currently existing framework since generating motor commands is similar to making predictions. However, all the implementations of HTMs to date have been purely sensory, without a motor component.
Progress toward the implementation of HTM

We have made substantial progress turning the HTM theoretical framework into a practical technology. We have implemented and tested several versions of the HTM cortical learning algorithms and find the basic architecture to be sound. As we test the algorithms on new data sets, we will refine the algorithms and add missing pieces. We will update this document as we do. The next three chapters describe the current state of the algorithms.

There are many components of the theory that are not yet implemented, including attention, feedback between regions, specific timing, and behavior/sensory-motor integration. These missing components should fit into the framework already created.
Chapter 2: HTM Cortical Learning Algorithms

This chapter describes the learning algorithms at work inside an HTM region. Chapters 3 and 4 describe the implementation of the learning algorithms using pseudocode, whereas this chapter is more conceptual.

Terminology

Before we get started, a note about terminology might be helpful. We use the language of neuroscience in describing the HTM learning algorithms. Terms such as cells, synapses, potential synapses, dendrite segments, and columns are used throughout. This terminology is appropriate since the learning algorithms were largely derived by matching neuroscience details to theoretical needs. However, in the process of implementing the algorithms we were confronted with performance issues and therefore once we felt we understood how something worked we would look for ways to speed processing. This often involved deviating from a strict adherence to biological details as long as we could get the same results. If you are new to neuroscience this won't be a problem. However, if you are familiar with neuroscience terms, you might find yourself confused as our use of terms varies from your expectation. Appendix A discusses the differences between biological neurons and HTM cells in detail, but right now it will be helpful to mention a few of the deviations that are likely to cause the most confusion.
Cell states

HTM cells have three output states: active from feed-forward input, active from lateral input (which represents a prediction), and inactive. The first output state corresponds to a short burst of action potentials in a neuron. The second corresponds to a slower, steady rate of action potentials in a neuron. We have not found a need to model individual action potentials or scalar rates of activity beyond these three states. The use of distributed representations seems to overcome the need to model scalar activity rates in cells.

Dendrite segments

HTM cells have a relatively realistic (and therefore complex) dendrite model. In theory each HTM cell has one proximal dendrite segment and a dozen or two distal dendrite segments. The proximal dendrite segment receives feed-forward input and the distal dendrite segments receive lateral input from nearby cells. A class of inhibitory cells forces all the cells in a column to respond to similar feed-forward input. To simplify, we removed the proximal dendrite segment from each cell and replaced it with a single shared dendrite segment per column of cells. The spatial pooler function (described below) operates on the shared dendrite segment, at the level of columns. The temporal pooler function operates on the distal dendrite segments, at the level of individual cells within columns. This simplification achieves the same functionality, though in biology there is no equivalent to a dendrite segment attached to a column.
Synapses

HTM synapses have binary weights. Biological synapses have varying weights but they are also partially stochastic, suggesting a biological neuron cannot rely on precise synaptic weights. The use of distributed representations, plus our model of dendrite operation, allows us to assign binary weights to HTM synapses with no ill effect. To model the forming and un-forming of synapses we use two additional concepts from neuroscience. One is the concept of "potential synapses". This represents all the axons that pass close enough to a dendrite segment that they could potentially form a synapse. The second is called "permanence". This is a scalar value assigned to each potential synapse. The permanence of a synapse represents a range of connectedness between an axon and a dendrite. Biologically, the range would go from completely unconnected, to starting to form a synapse but not connected yet, to a minimally connected synapse, to a large fully connected synapse. The permanence of a synapse is a scalar value ranging from 0.0 to 1.0. Learning involves incrementing and decrementing a synapse's permanence. When a synapse's permanence is above a threshold, it is connected with a weight of "1". When it is below the threshold, it is unconnected with a weight of "0".
Overview

Imagine that you are a region of an HTM. Your input consists of thousands of bits. These input bits may represent sensory data or they may come from another region lower in the hierarchy. They are turning on and off in complex ways. What are you supposed to do with this input?

We already have discussed the answer in its simplest form. Each HTM region looks for common patterns in its input and then learns sequences of those patterns. From its memory of sequences, each region makes predictions. That high-level description makes it sound easy, but in reality there is a lot going on. Let's break it down a little further into the following three steps:

1) Form a sparse distributed representation of the input
2) Form a representation of the input in the context of previous inputs
3) Form a prediction based on the current input in the context of previous inputs

We will discuss each of these steps in more detail.
1) Form a sparse distributed representation of the input

When you imagine an input to a region, think of it as a large number of bits. In a brain these would be axons from neurons. At any point in time some of these input bits will be active (value 1) and others will be inactive (value 0). The percentage of input bits that are active may vary, say from 0% to 60%. The first thing an HTM region does is to convert this input into a new representation that is sparse. For example, the input might have 40% of its bits "on" but the new representation has just 2% of its bits "on".

An HTM region is logically comprised of a set of columns, and each column is comprised of one or more cells. Columns may be logically arranged in a 2D array but this is not a requirement. Each column in a region is connected to a unique subset of the input bits (usually overlapping with other columns but never exactly the same subset of input bits). As a result, different input patterns result in different levels of activation of the columns. The columns with the strongest activation inhibit, or deactivate, the columns with weaker activation. (The inhibition occurs within a radius that can span from very local to the entire region.) The sparse representation of the input is encoded by which columns are active and which are inactive after inhibition. The inhibition function is defined to achieve a relatively constant percentage of columns to be active, even when the number of input bits that are active varies significantly.
Figure 2.1: An HTM region consists of columns of cells. Only a small portion of a region is shown.
Each column of cells receives activation from a unique subset of the input. Columns with the
strongest activation inhibit columns with weaker activation. The result is a sparse distributed
representation of the input (active columns are shown in light grey).
Imagine now that the input pattern changes. If only a few input bits change, some columns will receive a few more or a few less inputs in the "on" state, but the set of active columns will not likely change much. Thus similar input patterns (ones that have a significant number of active bits in common) will map to a relatively stable set of active columns. How stable the encoding is depends on which inputs each column has learned to connect to; these connections are learned via a mechanism described in detail later.
All these steps (learning the connections to each column from a subset of the inputs, determining the level of input to each column, and using inhibition to select a sparse set of active columns) are referred to as the "Spatial Pooler". The term means patterns that are "spatially" similar (meaning they share a large number of active bits) are "pooled" (meaning they are grouped together in a common representation).
2) Form a representation of the input in the context of previous inputs

The next function performed by a region is to convert the columnar representation of the input into a new representation that includes state, or context, from the past. The new representation is formed by activating a subset of the cells within each column, typically only one cell per column.

Consider hearing two spoken sentences, "I ate a pear" and "I have eight pears". The words "ate" and "eight" are homonyms; they sound identical. We can be certain that at some point in the brain there are neurons that respond identically to the spoken words "ate" and "eight". After all, identical sounds are entering the ear. However, we also can be certain that at another point in the brain the neurons that respond to this input are different, in different contexts. The representations for the sound "ate" will differ when you hear "I ate" vs. "I have eight". Imagine that you have memorized the two sentences "I ate a pear" and "I have eight pears". Hearing "I ate..." leads to a different prediction than "I have eight...". There must be different internal representations after hearing "I ate" and "I have eight".

This principle of encoding an input differently in different contexts is a universal feature of perception and action and is one of the most important functions of an HTM region.
To represent the same input differently in different contexts, an HTM region uses different cells within each active column. To see how powerful this is, a detailed example might help. Say every column has 4 cells and the representation of every input consists of 100 active columns. If only one cell per column is active at a time, we have 4^100 ways of representing the exact same input. The same input will always result in the same 100 columns being active, but in different contexts different cells in those columns will be active. Now we can represent the same input in a very large number of contexts, but how unique will those different representations be? Nearly all randomly chosen pairs of the 4^100 possible patterns will overlap by about 25 cells. Thus two representations of a particular input in different contexts will have about 25 cells in common and 75 cells that are different, making them easily distinguishable.
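These numbers can be checked directly. The sketch below is an illustrative calculation (not part of the algorithm itself): it represents a context as a choice of one active cell in each of the 100 active columns, and measures how many cells two random contexts share. Since each column independently matches with probability 1/4, the average overlap comes out near 25 of 100 cells.

```python
import random

COLUMNS, CELLS_PER_COLUMN = 100, 4

# Number of distinct cell-level codes for the same 100 active columns:
num_codes = CELLS_PER_COLUMN ** COLUMNS   # 4^100, an astronomically large number

rng = random.Random(0)
def random_context():
    """One active cell chosen per active column."""
    return [rng.randrange(CELLS_PER_COLUMN) for _ in range(COLUMNS)]

# Average overlap between random pairs of contexts: each column matches
# with probability 1/4, so roughly 25 of the 100 cells coincide.
trials = 2000
avg = sum(
    sum(a == b for a, b in zip(random_context(), random_context()))
    for _ in range(trials)
) / trials
print(round(avg))  # close to 25
```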
The general rule used by an HTM region is as follows. When a column becomes active, it looks at all the cells in the column. If one or more cells in the column are in the predictive state, only those cells will become active; the rest of the cells in the column remain inactive.

If there is no prior state, and therefore no context and prediction, all the cells in a column will become active when the column becomes active. This scenario is similar to hearing the first note in a song. Without context you usually can't predict what will happen next; in this case all options are available. If there is prior state but the input does not match what is expected, all the cells in the active column will become active. Looked at over a collection of columns, a match or mismatch is never an all-or-nothing event.
Figure 2.2: By activating a subset of cells in each column, an HTM region can represent the same
input in many different contexts. Columns only activate predicted cells. Columns with no
predicted cells activate all the cells in the column. The figure shows some columns with one cell
active and some columns with all cells active.
As mentioned in the terminology section above, HTM cells can be in one of three states. If a cell is active due to feed-forward input, we just use the term "active". If the cell is active due to lateral connections to other nearby cells, we say it is in the "predictive state".
3) Form a prediction based on the input in the context of previous inputs

The final step for our region is to make a prediction of what is likely to happen next. The prediction is based on the representation formed in step 2), which includes context from all previous inputs.

When a region makes a prediction it activates (into the predictive state) all the cells that will likely become active due to future feed-forward input. Because representations in a region are sparse, multiple predictions can be made at the same time. For example, if 2% of the columns are active due to an input, you could expect that ten different predictions could be made resulting in 20% of the columns having a predicted cell. Or, twenty different predictions could be made resulting in 40% of the columns having a predicted cell. If each column had four cells, with one active at a time, then 10% of the cells would be in the predictive state.
A future chapter on sparse distributed representations will show that even though different predictions are merged together, a region can know with high certainty whether a particular input was predicted or not.
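This property of merged predictions is straightforward to demonstrate. The sketch below is an illustrative calculation using the 2% figures from the text (ten predicted patterns of 200 cells out of 10,000, superimposed as a union of predictive cells): an input that was actually predicted is fully contained in the union, while an unpredicted input overlaps it only by chance.

```python
import random

rng = random.Random(0)
population = range(10000)

# Ten different predicted patterns, merged into one set of predictive cells:
predictions = [set(rng.sample(population, 200)) for _ in range(10)]
union = set().union(*predictions)

predicted_input = predictions[3]                 # one of the predicted patterns
novel_input = set(rng.sample(population, 200))   # an unpredicted pattern

print(len(predicted_input & union))  # 200: fully contained in the union
print(len(novel_input & union))      # far lower: only chance overlap
```

Because the chance overlap stays small even after merging ten predictions, the region can still tell predicted inputs from unpredicted ones with high certainty.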
How does a region make a prediction? When input patterns change over time, different sets of columns and cells become sequentially active. When a cell becomes active, it forms connections to a subset of the cells nearby that were active immediately beforehand. These connections can be formed quickly or slowly depending on the learning rate required by the application. Later, all a cell needs to do is to look at these connections for a coincidence of activity. If the connections become active, the cell can expect that it might become active shortly and enters a predictive state. Thus the feed-forward activation of a set of cells will lead to the predictive activation of other sets of cells that typically follow. Think of this as the moment when you recognize a song and start predicting the next notes.
Figure 2.3: At any point in time, some cells in an HTM region will be active due to feed-forward
input (shown in light gray). Other cells that receive lateral input from active cells will be in a
predictive state (shown in dark gray).
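A toy version of this mechanism can be written directly from the description above. This sketch is illustrative only; the real algorithm, with permanences and learning on distal dendrite segments, is given in Chapter 4. Here a cell simply stores, as a "segment", each set of cells that was active just before it fired, and enters the predictive state when enough of any stored set is active again.

```python
class Cell:
    def __init__(self):
        self.segments = []          # each entry: a set of presynaptic cell ids

    def learn(self, prev_active):
        """Remember which cells were active immediately before this cell fired."""
        self.segments.append(set(prev_active))

    def is_predictive(self, active_cells, threshold=2):
        """Enter the predictive state if any stored segment sees enough of its
        remembered cells active right now (a coincidence of activity)."""
        return any(len(seg & active_cells) >= threshold
                   for seg in self.segments)

cell = Cell()
cell.learn({1, 2, 3})               # cells 1, 2, 3 preceded this cell's activity
print(cell.is_predictive({1, 2}))   # True: enough of the pattern has returned
print(cell.is_predictive({7, 8}))   # False: unrelated activity
```

Note that the threshold is below the full segment size, so the cell predicts correctly even when the preceding pattern recurs with some of its cells missing.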
In summary, when a new input arrives, it leads to a sparse set of active columns. One or more of the cells in each column become active; these in turn cause other cells to enter a predictive state through learned connections between cells in the region. The cells activated by these connections constitute a prediction of what is likely to happen next. When the next feed-forward input arrives, it selects another sparse set of active columns. If a newly active column is unexpected, meaning it was not predicted by any cells, it will activate all the cells in the column. If a newly active column has one or more predicted cells, only those cells will become active. The output of a region is the activity of all cells in the region, including the cells active because of feed-forward input and the cells active in the predictive state.
As mentioned earlier, predictions are not just for the next time step. Predictions in an HTM region can be for several time steps into the future. Using melodies as an example, an HTM region would not just predict the next note in a melody, but might predict the next four notes. This leads to a desirable property. The output of a region (the union of all the active and predicted cells in a region) changes more slowly than the input. Imagine the region is predicting the next four notes in a melody A, B, C, D, E, F, G. After hearing the first two notes, the region recognizes the sequence and starts predicting. It predicts C, D, E, F. The "B" cells are already active, so cells for B, C, D, E, F are all in one of the two active states. Now the region hears the next note "C". The set of active and predictive cells now represents "C, D, E, F, G". Note that the input pattern changed completely going from "B" to "C", but only 20% of the cells changed.

Because the output of an HTM region is a vector representing the activity of all the region's cells, the output in this example is five times more stable than the input. In a hierarchical arrangement of regions, we will see an increase in temporal stability as you ascend the hierarchy.
We use the term "temporal pooler" to describe the two steps of adding context to the representation and predicting. By creating slowly changing outputs for sequences of patterns, we are in essence "pooling" together different patterns that follow each other in time.

Now we will go into more detail about how the spatial and temporal pooler work. We start with concepts and details shared by both, followed by concepts and details unique to the spatial pooler, followed by concepts and details unique to the temporal pooler.
Shared concepts

Learning in the spatial pooler and temporal pooler is similar. Learning in both cases involves establishing connections, or synapses, between cells. The temporal pooler learns connections between cells in the same region. The spatial pooler learns feed-forward connections between input bits and columns.

Binary weights

HTM synapses have only a 0 or 1 effect; their "weight" is binary, a property unlike many neural network models which use scalar variable values in the range of 0 to 1.
Permanence

Synapses are forming and unforming constantly during learning. As mentioned before, we assign a scalar value to each synapse (0.0 to 1.0) to indicate how permanent the connection is. When a connection is reinforced, its permanence is increased. Under other conditions, the permanence is decreased. When the permanence is above a threshold (e.g. 0.2), the synapse is considered established. If the permanence is below the threshold, the synapse will have no effect.
Dendrite segments

Synapses connect to dendrite segments. There are two types of dendrite segments, proximal and distal.

A proximal dendrite segment forms synapses with feed-forward inputs. The active synapses on this type of segment are linearly summed to determine the feed-forward activation of a column.

A distal dendrite segment forms synapses with cells within the region. Every cell has several distal dendrite segments. If the sum of the active synapses on a distal segment exceeds a threshold, then the associated cell becomes active in a predictive state. Since there are multiple distal dendrite segments per cell, a cell's predictive state is the logical OR operation of several constituent threshold detectors.
Potential Synapses

As mentioned earlier, each dendrite segment has a list of potential synapses. These represent the inputs that could plausibly connect to the segment. All the potential synapses are given a permanence value and may become functional synapses if their permanence values exceed a threshold.
Learning

Learning involves incrementing or decrementing the permanence values of potential synapses on a dendrite segment. The rules used for making synapses more or less permanent are similar to "Hebbian" learning rules. For example, if a post-synaptic cell is active due to a dendrite segment receiving input above its threshold, then the permanence values of the synapses on that segment are modified. Synapses that are active, and therefore contributed to the cell being active, have their permanence increased. Synapses that are inactive, and therefore did not contribute, have their permanence decreased. The exact conditions under which synapse permanence values are updated differ in the spatial and temporal pooler. The details are described below.
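The rule just described fits in a few lines. In this sketch the connection threshold (the 0.2 value mentioned earlier) and the increment/decrement step sizes are illustrative values, not figures prescribed by this document:

```python
CONNECTED_PERM = 0.2              # permanence threshold for a valid synapse
PERM_INC, PERM_DEC = 0.05, 0.03   # assumed learning step sizes

def is_connected(permanence):
    """Binary weight: a synapse either counts fully or not at all."""
    return permanence >= CONNECTED_PERM

def adapt_segment(permanences, active_inputs):
    """Hebbian-style update for a segment that caused its cell to become
    active: strengthen synapses whose input was active, weaken the rest.
    `permanences` maps input id -> permanence (clamped to 0.0..1.0)."""
    return {
        inp: min(1.0, p + PERM_INC) if inp in active_inputs
             else max(0.0, p - PERM_DEC)
        for inp, p in permanences.items()
    }

perms = {"a": 0.19, "b": 0.50, "c": 0.21}
perms = adapt_segment(perms, active_inputs={"a"})
print(is_connected(perms["a"]))  # True: 0.19 + 0.05 crossed the threshold
print(is_connected(perms["c"]))  # False: decremented below the threshold
```

Note how learning can flip a synapse between connected and unconnected even though each individual update only nudges a scalar permanence; this is the mechanism behind "forming" and "unforming" synapses.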
Now we will discuss concepts specific to the spatial and temporal pooler functions.

Spatial pooler concepts

The most fundamental function of the spatial pooler is to convert a region's input into a sparse pattern. This function is important because the mechanism used to learn sequences and make predictions requires starting with sparse distributed patterns.

There are several overlapping goals for the spatial pooler, which determine how the spatial pooler operates and learns.
1) Use all columns

An HTM region has a fixed number of columns that learn to represent common patterns in the input. One objective is to make sure all the columns learn to represent something useful regardless of how many columns there are. We don't want columns that are never active. To prevent this, we keep track of how often a column is active relative to its neighbors. If the relative activity of a column is too low, it boosts its input activity level until it starts to be part of the winning set of columns. In essence, all columns are competing with their neighbors to be a participant in representing input patterns. If a column is rarely active, it will become more aggressive. When it does, other columns will be forced to modify their input and start representing slightly different input patterns.
2) Maintain desired density

A region needs to form a sparse representation of its inputs. Columns with the most input inhibit their neighbors. The radius of inhibition is proportional to the size of the receptive fields of the columns (and therefore can range from small to the size of the entire region). Within the radius of inhibition, we allow only a percentage of the columns with the most active input to be "winners". The remainders of the columns are disabled. (A "radius" of inhibition implies a 2D arrangement of columns, but the concept can be adapted to other topologies.)
3) Avoid trivial patterns

We want all our columns to represent non-trivial patterns in the input. This goal can be achieved by setting a minimum threshold of input for the column to be active. For example, if we set the threshold to 50, it means that a column must have at least 50 active synapses on its dendrite segment to be active, guaranteeing a certain level of complexity to the pattern it represents.
4) Avoid extra connections

If we aren't careful, a column could form a large number of valid synapses. It would then respond strongly to many different unrelated input patterns, with different subsets of the synapses responding to different patterns. To avoid this problem, we decrement the permanence value of any synapse that isn't currently contributing to a winning column. By making sure non-contributing synapses are sufficiently penalized, we guarantee a column represents a limited number of input patterns, sometimes only one.
5) Self adjusting receptive fields

Real brains are highly "plastic"; regions of the neocortex can learn to represent entirely different things in reaction to various changes. If part of the neocortex is damaged, other parts will adjust to represent what the damaged part used to represent. If a sensory organ is damaged or changed, the associated part of the neocortex will adjust to represent something else. The system is self-adjusting.
We want our HTM regions to exhibit the same flexibility. If we allocate 10,000 columns to a region, it should learn how to best represent the input with 10,000 columns. If we allocate 20,000 columns, it should learn how best to use that number. If the input statistics change, the columns should change to best represent the new reality. In short, the designer of an HTM should be able to allocate any resources to a region and the region will do the best job it can of representing the input based on the available columns and input statistics. The general rule is that with more columns in a region, each column will represent larger and more detailed patterns in the input. With fewer columns, each column will represent smaller and simpler patterns, yet we will maintain a relatively constant sparsity level.

There is no learning rule specific to this goal. By combining all of the above: boosting inactive columns, inhibiting neighboring columns to maintain constant sparsity, establishing minimal thresholds for input, maintaining a large pool of potential synapses, and adding and forgetting synapses based on their contribution, the ensemble of columns will dynamically configure to achieve the desired effect.
Spatial pooler details
We can now go through everything the spatial pooling function does.
1) Start with an input consisting of a fixed number of bits. These bits might represent sensory data or they might come from another region lower in the hierarchy.
2) Assign a fixed number of columns to the region receiving this input. Each column has an associated dendrite segment. Each dendrite segment has a set of potential synapses representing a subset of the input bits. Each potential synapse has a permanence value. Based on their permanence values, some of the potential synapses will be valid.
3) For any given input, determine how many valid synapses on each column are connected to active input bits.
4) The number of active synapses is multiplied by a "boosting" factor which is dynamically determined by how often a column is active relative to its neighbors.
5) The columns with the highest boosted input disable all but a fixed percentage of the columns within an inhibition radius. The inhibition radius is itself dynamically determined by the spread (or "fan-out") of the input bits. There is now a sparse set of active columns.
6) For each of the active columns, we adjust the permanence values of all the potential synapses. The permanence values of synapses aligned with active input bits are increased; the permanence values of synapses aligned with inactive input bits are decreased. The changes made to permanence values may change some synapses from being valid to not valid, and vice versa.
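The six steps above can be sketched in a few lines of Python. This is an illustrative model only, not the reference implementation: the parameter names (`PERM_THRESHOLD`, `NUM_WINNERS`) and the use of a simple global top-k in place of a true local inhibition radius are assumptions made for brevity.

```python
import random

# Illustrative sketch of one spatial-pooler time step (simplified; assumed
# parameter names, global rather than local inhibition).
PERM_THRESHOLD = 0.2        # permanence above this -> synapse is "valid"
NUM_WINNERS = 40            # stand-in for a fixed local winner density
PERM_INC, PERM_DEC = 0.05, 0.05

def spatial_pooler_step(columns, input_bits, boost):
    """columns: list of {input_index: permanence} dicts, one per column."""
    # Steps 3-4: count valid synapses on active input bits, then apply boost.
    overlaps = []
    for c, syns in enumerate(columns):
        raw = sum(1 for j, p in syns.items()
                  if p >= PERM_THRESHOLD and input_bits[j])
        overlaps.append(raw * boost[c])
    # Step 5: inhibition stand-in -- keep only the top NUM_WINNERS columns.
    winners = sorted(range(len(columns)), key=lambda c: -overlaps[c])[:NUM_WINNERS]
    # Step 6: winners strengthen synapses on active bits, weaken the rest.
    for c in winners:
        for j in columns[c]:
            delta = PERM_INC if input_bits[j] else -PERM_DEC
            columns[c][j] = min(1.0, max(0.0, columns[c][j] + delta))
    return winners
```

Because step 6 only touches the winning columns, repeated exposure to an input pulls the winners' synapses toward the valid/invalid boundary described above.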
Temporal pooler concepts
Recall that the temporal pooler learns sequences and makes predictions. The basic method is that when a cell becomes active, it forms connections to other cells that were active just prior. Cells can then predict when they will become active by looking at their connections. If all the cells do this, collectively they can store and recall sequences, and they can predict what is likely to happen next. There is no central storage for a sequence of patterns; instead, memory is distributed among the individual cells. Because the memory is distributed, the system is robust to noise and error. Individual cells can fail, usually with little or no discernible effect.
It is worth noting a few important properties of sparse distributed representations that the temporal pooler exploits.
Assume we have a hypothetical region that always forms representations by using 200 active cells out of a total of 10,000 cells (2% of the cells are active at any time). How can we remember and recognize a particular pattern of 200 active cells? A simple way is to make a list of the 200 cells we care about, and check whether they are all active again. However, what if we made a list of only 20 of the 200 active cells and ignored the other 180? What would happen? You might think that remembering only 20 cells would cause lots of errors, that those 20 cells would appear in many different patterns of 200. But this isn't the case. Because the patterns are large and sparse (in this example 200 active cells out of 10,000), remembering 20 active cells is almost as good as remembering all 200. The chance of error in a practical system is exceedingly small, and we have reduced our memory needs considerably.
The cells in an HTM region take advantage of this property. Each of a cell's dendrite segments has a set of connections to other cells in the region. A dendrite segment forms these connections as a means of recognizing the state of the network at some point in time. There may be hundreds or thousands of cells active nearby, but the segment only has to connect to a small subset of them. When the dendrite segment sees 15 of those active cells, it can be fairly certain the larger pattern is occurring. This technique is called "sub-sampling" and is used throughout the HTM algorithms.
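The sub-sampling argument can be checked with a short calculation using the numbers from the example above (illustrative only; the formula counts how many of the possible 200-cell patterns happen to contain all 20 remembered cells):

```python
from math import comb

# Chance that an *unrelated* random pattern of 200 active cells (out of
# 10,000) happens to contain all 20 cells a segment sub-sampled, i.e. the
# chance of a false match.
total, active, sampled = 10_000, 200, 20
p_false_match = comb(total - sampled, active - sampled) / comb(total, active)
print(p_false_match)   # vanishingly small, on the order of 10**-35
```

This is why remembering 20 of the 200 cells is almost as good as remembering all 200: the false-match probability is negligible for any practical lifetime of the system.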
Every cell participates in many different distributed patterns and in many different sequences. A particular cell might be part of dozens or hundreds of temporal transitions. Therefore every cell has several dendrite segments, not just one.
Ideally a cell would have one dendrite segment for each pattern of activity it wants to recognize. In practice, however, a dendrite segment can learn connections for several completely different patterns and still work well. For example, one segment might learn 20 connections for each of 4 different patterns, for a total of 80 connections. We then set a threshold so the dendrite segment becomes active when any 15 of its connections are active. This introduces the possibility of error: connections from different patterns might incorrectly combine to exceed the threshold. However, this kind of error is very unlikely, again due to the sparseness of the representations.
Now we can see how a cell with one or two dozen dendrite segments and a few thousand synapses can recognize hundreds of separate states of cell activity.
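The pooled-segment idea above can be demonstrated directly. The sketch below (illustrative; the cell count, sample sizes, and threshold mirror the running example, and `segment_active` is an assumed helper name) stores sub-samples of four patterns on a single segment and shows that each stored pattern still crosses the threshold while an unrelated sparse pattern does not:

```python
import random

# One segment storing 20-cell sub-samples of 4 different 200-cell patterns
# (80 synapses total) still recognizes each pattern with a threshold of 15.
random.seed(1)
cells = range(10_000)
patterns = [set(random.sample(cells, 200)) for _ in range(4)]

segment = set()
for p in patterns:
    segment |= set(random.sample(sorted(p), 20))   # sub-sample 20 cells per pattern

THRESHOLD = 15

def segment_active(active_cells):
    """The segment fires when enough of its connections are to active cells."""
    return len(segment & active_cells) >= THRESHOLD
```

Each stored pattern contributes at least its own 20 sub-sampled cells, so it is recognized; an unrelated 200-of-10,000 pattern overlaps the 80-synapse segment by only a cell or two on average.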
Temporal pooler details
We can now enumerate the steps the temporal pooler performs. We start where the spatial pooler left off, with a set of active columns representing the feed-forward input.
1) For each active column, check for cells in the column that are in a predictive state, and activate them. If no cells are in a predictive state, activate all the cells in the column. The resulting set of active cells is the representation of the input in the context of prior input.
2) For every dendrite segment on every cell in the region, count how many established synapses are connected to active cells. If the number exceeds a threshold, that dendrite segment is marked as active. Cells with active dendrite segments are put in the predictive state unless they are already active due to feed-forward input. Cells with no active dendrites and no feed-forward input become or remain inactive. The collection of cells now in the predictive state is the prediction of the region.
3) When a dendrite segment becomes active, modify the permanence values of all the synapses associated with the segment. For every synapse on the active dendrite segment, increase the permanence of those synapses that are connected to active cells and decrement the permanence of those synapses connected to inactive cells. These changes to synapse permanence are marked as temporary.
This modifies the synapses on segments that are already trained sufficiently to make the segment active, and thus lead to a prediction. However, we always want to extend predictions further back in time if possible. Thus, we pick a second dendrite segment on the same cell to train: the segment that best matches the state of the system in the previous time step. For this second segment, using the state of the system in the previous time step, increase the permanence of those synapses that are connected to active cells and decrement the permanence of those synapses connected to inactive cells. Again, these changes are marked as temporary.
4) Whenever a cell switches from being inactive to active due to feed-forward input, we traverse each potential synapse associated with the cell and remove any temporary marks. Thus we make the permanence updates permanent only for synapses that correctly predicted the feed-forward activation of the cell.
5) When a cell switches from either active state to inactive, undo any permanence changes marked as temporary for each potential synapse on this cell. We don't want to strengthen the permanence of synapses that incorrectly predicted the feed-forward activation of a cell.
Note that only cells that are active due to feed-forward input propagate activity within the region; otherwise predictions would lead to further predictions. But all the active cells (feed-forward and predictive) form the output of a region and propagate to the next region in the hierarchy.
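Steps 1 and 2 above, minus the temporary-mark learning bookkeeping of steps 3-5, can be sketched as follows. This is an illustrative simplification under assumed data structures (cells as `(column, cell)` pairs, segments as sets of presynaptic cells), not the implementation described in Chapter 4:

```python
# Simplified sketch of one temporal-pooler time step (inference only;
# learning and the temporary synapse marks are omitted; structures assumed).
def temporal_pooler_step(active_columns, prev_predictive, segments, threshold):
    """active_columns: set of column indices from the spatial pooler.
    prev_predictive: set of (col, cell) pairs predictive at t-1.
    segments: {(col, cell): [set_of_presynaptic_(col, cell) pairs, ...]}."""
    # Step 1: activate predicted cells; burst columns that were unpredicted.
    active = set()
    for c in active_columns:
        predicted = {ci for ci in prev_predictive if ci[0] == c}
        if predicted:
            active |= predicted
        else:
            active |= {(c, i) for i in range(CELLS_PER_COLUMN)}
    # Step 2: a cell enters the predictive state when any of its segments
    # connects to enough currently active cells.
    predictive = {ci for ci, segs in segments.items()
                  if any(len(s & active) >= threshold for s in segs)}
    # The region's output is the union of active and predictive cells.
    return active, predictive

CELLS_PER_COLUMN = 4
```

Note how an unpredicted column activates all of its cells (the "many contexts" representation), while a predicted column activates only the cells that anticipated the input.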
First order versus variable order sequences and prediction
There is one more major topic to discuss before we end our discussion of the spatial and temporal poolers. It may not be of interest to all readers and it is not needed to understand Chapters 3 and 4.
What is the effect of having more or fewer cells per column? Specifically, what happens if we have only one cell per column? In the example used earlier, we showed that a representation comprised of 100 active columns with 4 cells per column can be encoded in 4^100 different ways, a very big number. Therefore, the same input can appear in many different contexts without confusion. For example, if each input pattern represented a word, then a region could remember many sentences that use the same words over and over again without getting confused. This type of memory is called "variable order", meaning that the amount of prior context used to make a prediction varies with what is needed.
If we increase to five cells per column, the available number of encodings of any particular input in our example would increase to 5^100, a huge increase over 4^100. But both these numbers are so large that for many practical problems this increase in capacity might not be useful.
However, making the number of cells per column much smaller does make a big difference.
If we go all the way to one cell per column, we lose the ability to include context in our representations. An input to a region always results in the same prediction, regardless of previous activity. With one cell per column, the memory of an HTM region is a "first order" memory; predictions are based only on the current input.
First order prediction is ideally suited for one type of problem that brains solve: static spatial inference. As stated earlier, a human exposed to a brief visual image can recognize what the object is even if the exposure is too short for the eyes to move. With hearing, you always need to hear a sequence of patterns to recognize what it is. Vision is usually similar, in that you usually process a stream of visual images. But under certain conditions you can recognize an image with a single exposure.
Temporal and static recognition might appear to require different inference mechanisms. One requires recognizing sequences of patterns and making predictions based on variable-length context. The other requires recognizing a static spatial pattern without using temporal context. An HTM region with multiple cells per column is ideally suited for recognizing time-based sequences, and an HTM region with one cell per column is ideally suited for recognizing spatial patterns. At Numenta, we have performed many experiments using one-cell-per-column regions applied to vision problems. The details of these experiments are beyond the scope of this chapter; however, we will cover the important concepts.
If we expose an HTM region to images, the columns in the region learn to represent common spatial arrangements of pixels. The kinds of patterns learned are similar to what is observed in region V1 in the neocortex (a neocortical region extensively studied in biology), typically lines and corners at different orientations. By training on moving images, the HTM region learns the transitions of these basic shapes. For example, a vertical line at one position is often followed by a vertical line shifted to the left or right. All the commonly observed transitions of patterns are remembered by the HTM region.
Now what happens if we expose a region to an image of a vertical line moving to the right? If our region has only one cell per column, it will predict the line might next appear to the left or to the right; it cannot use the context of knowing where the line was in the past, and therefore cannot know whether it is moving left or right. What you find is that such a cell behaves like a "complex cell" in the neocortex. The predictive output of such a cell will be active for a visible line in different positions, regardless of whether the line is moving left or right. We have observed that a region like this exhibits stability to translation, changes in scale, etc. while maintaining distinctiveness between different images. This behavior is what is needed for spatial invariance (recognizing the same pattern in different locations of an image).
If we now do the same experiment on an HTM region with multiple cells per column, we find that the cells behave like "directionally-tuned complex cells" in the neocortex. The predictive output of a cell will be active for a line moving to the left or a line moving to the right, but not both.
Putting this all together, we make the following hypothesis. The neocortex has to do both first order and variable order inference and prediction. There are four or five layers of cells in each region of the neocortex. The layers differ in several ways, but they all have shared columnar response properties and large horizontal connectivity within the layer. We speculate that each layer of cells in the neocortex is performing a variation of the HTM inference and learning rules described in this chapter, with the different layers playing different roles. For example, it is known from anatomical studies that layer 6 creates feedback in the hierarchy and layer 5 is concerned with motor behavior. The two primary feed-forward layers of cells are layers 4 and 3. We speculate that one of the differences between layers 4 and 3 is that the cells in layer 4 are acting independently, i.e. one cell per column, whereas the cells in layer 3 are acting as multiple cells per column. In this way, the first order sequence memory (roughly corresponding to layer 4 neurons) is useful in forming representations that are invariant to spatial changes, whereas the variable order sequence memory (roughly corresponding to layer 3 neurons) is useful for inference and prediction of moving images.
In summary, we hypothesize that algorithms similar to those described in this chapter are at work in all layers of neurons in the neocortex. The layers in the neocortex vary in significant details, which make them play different roles related to feed-forward vs. feedback, attention, and motor behavior. In regions close to sensory input, it is useful to have a layer of neurons performing first order memory, as this leads to spatial invariance.
At Numenta, we have experimented with first order (single cell per column) HTM regions for image recognition problems, and with variable order (multiple cells per column) HTM regions for recognizing and predicting sequences. In the future, it would be logical to try to combine these in a single region and to extend the algorithms to other purposes. However, we believe many interesting problems can be addressed with the equivalent of single-layer, multiple-cell-per-column regions, either alone or in a hierarchy.
Chapter 3: Spatial Pooling Implementation and Pseudocode
This chapter contains the detailed pseudocode for a first implementation of the spatial pooler function. The input to this code is an array of bottom-up binary inputs from sensory data or the previous level. The code computes activeColumns(t), the list of columns that win due to the bottom-up input at time t. This list is then sent as input to the temporal pooler routine described in the next chapter, i.e. activeColumns(t) is the output of the spatial pooling routine.
The pseudocode is split into three distinct phases that occur in sequence:
Phase 1: compute the overlap with the current input for each column
Phase 2: compute the winning columns after inhibition
Phase 3: update synapse permanences and internal variables
Although spatial pooler learning is inherently online, you can turn off learning by simply skipping Phase 3.
The pseudocode for each of the three phases follows. The various data structures and supporting routines used in the code are defined at the end.
Initialization
Prior to receiving any inputs, the region is initialized by computing a list of initial potential synapses for each column. This consists of a random set of inputs selected from the input space. Each input is represented by a synapse and assigned a random permanence value. The random permanence values are chosen with two criteria. First, the values are chosen to be in a small range around connectedPerm (the minimum permanence value at which a synapse is considered "connected"). This enables potential synapses to become connected (or disconnected) after a small number of training iterations. Second, each column has a natural center over the input region, and the permanence values have a bias towards this center (they have higher values near the center).
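The two initialization criteria can be sketched as follows. This is illustrative only (the band width, bias strength, and function name are assumptions, and a 1-D input space stands in for whatever topology the input actually has):

```python
import random

# Illustrative initialization of one column's potential synapses: permanences
# are drawn in a small band around connectedPerm, biased toward the column's
# natural center over the input (assumed parameter values).
CONNECTED_PERM = 0.2

def init_column(center, input_size, n_potential=50, radius=100):
    synapses = {}
    for j in random.sample(range(input_size), n_potential):
        # Criterion 1: a small random band around the connection threshold...
        perm = CONNECTED_PERM + random.uniform(-0.1, 0.1)
        # Criterion 2: ...plus a bias so inputs near the center start higher.
        bias = max(0.0, 1.0 - abs(j - center) / radius) * 0.1
        synapses[j] = min(1.0, max(0.0, perm + bias))
    return synapses
```

Starting every permanence near connectedPerm is what lets a few training iterations flip a potential synapse between connected and disconnected.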
Phase 1: Overlap
Given an input vector, the first phase calculates the overlap of each column with that vector. The overlap for each column is simply the number of connected synapses with active inputs, multiplied by its boost. If this value is below minOverlap, we set the overlap score to zero.
1. for c in columns
2.
3. overlap(c) = 0
4. for s in connectedSynapses(c)
5. overlap(c) = overlap(c) + input(t, s.sourceInput)
6.
7. if overlap(c) < minOverlap then
8. overlap(c) = 0
9. else
10. overlap(c) = overlap(c) * boost(c)
Phase 2: Inhibition
The second phase calculates which columns remain as winners after the inhibition step. desiredLocalActivity is a parameter that controls the number of columns that end up winning. For example, if desiredLocalActivity is 10, a column will be a winner if its overlap score is greater than the score of the 10th highest column within its inhibition radius.
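The winner-selection rule just described can be sketched in Python. This is illustrative, not the reference pseudocode: a 1-D neighborhood stands in for the real inhibition topology, and the local `kth_score` helper mirrors the kthScore routine defined later in this chapter.

```python
# Sketch of Phase 2: a column wins if its overlap is positive and at least
# as large as the desiredLocalActivity-th highest overlap among its
# neighbors (1-D inhibition radius for simplicity; illustrative only).
def compute_active_columns(overlaps, desired_local_activity, inhibition_radius):
    def kth_score(neighbors, k):
        scores = sorted((overlaps[c] for c in neighbors), reverse=True)
        return scores[k - 1] if len(scores) >= k else 0
    active = []
    for c in range(len(overlaps)):
        neighbors = range(max(0, c - inhibition_radius),
                          min(len(overlaps), c + inhibition_radius + 1))
        if overlaps[c] > 0 and overlaps[c] >= kth_score(neighbors, desired_local_activity):
            active.append(c)
    return active
```

Because the comparison is local, each neighborhood contributes roughly desiredLocalActivity winners, which is how the region maintains its sparse density.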
Phase 3: Learning
The third phase performs learning; it updates the permanence values of all synapses as necessary, as well as the boost and inhibition radius.
The main learning rule operates on the winning columns: if a synapse is active, its permanence value is incremented, otherwise it is decremented. Permanence values are constrained to be between 0 and 1.
There are two separate boosting mechanisms in place to help a column learn connections. If a column does not win often enough (as measured by activeDutyCycle), its overall boost value is increased (lines 30-32). Alternatively, if a column's connected synapses do not overlap well with any inputs often enough (as measured by overlapDutyCycle), its permanence values are boosted (lines 34-36). Note: once learning is turned off, boost(c) is frozen.
Finally, at the end of Phase 3 the inhibition radius is recomputed (line 38).
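The main Phase 3 learning rule can be sketched as follows. This is an illustrative rendering of the rule described above, not the reference pseudocode; the parameter names follow the supporting-structures table below, and the concrete values are assumptions:

```python
# Sketch of the Phase 3 learning rule: winning columns strengthen potential
# synapses on active inputs and weaken the rest, clamped to [0, 1]
# (illustrative; parameter values are assumptions).
permanenceInc, permanenceDec = 0.05, 0.05

def learn(active_columns, potential_synapses, input_bits):
    """potential_synapses: {column: {input_index: permanence}}."""
    for c in active_columns:
        for j, perm in potential_synapses[c].items():
            if input_bits[j]:
                perm = min(1.0, perm + permanenceInc)
            else:
                perm = max(0.0, perm - permanenceDec)
            potential_synapses[c][j] = perm
```

Only winning columns are updated, so learning can be disabled simply by skipping this phase, as noted earlier in the chapter.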
Supporting data structures and routines
The following variables and data structures are used in the pseudocode:
columns                List of all columns.
input(t,j)             The input to this level at time t. input(t,j) is 1 if the j'th input is on.
overlap(c)             The spatial pooler overlap of column c with a particular input pattern.
activeColumns(t)       List of column indices that are winners due to bottom-up input.
desiredLocalActivity   A parameter controlling the number of columns that will be winners after the inhibition step.
inhibitionRadius       Average connected receptive field size of the columns.
neighbors(c)           A list of all the columns that are within inhibitionRadius of column c.
minOverlap             A minimum number of inputs that must be active for a column to be considered during the inhibition step.
boost(c)               The boost value for column c as computed during learning, used to increase the overlap value for inactive columns.
synapse                A data structure representing a synapse; contains a permanence value and the source input index.
connectedPerm          If the permanence value for a synapse is greater than this value, it is said to be connected.
potentialSynapses(c)   The list of potential synapses and their permanence values.
connectedSynapses(c)   A subset of potentialSynapses(c) where the permanence value is greater than connectedPerm. These are the bottom-up inputs that are currently connected to column c.
permanenceInc          Amount permanence values of synapses are incremented during learning.
permanenceDec          Amount permanence values of synapses are decremented during learning.
activeDutyCycle(c)     A sliding average representing how often column c has been active after inhibition (e.g. over the last 1000 iterations).
overlapDutyCycle(c)    A sliding average representing how often column c has had significant overlap (i.e. greater than minOverlap) with its inputs (e.g. over the last 1000 iterations).
minDutyCycle(c)        A variable representing the minimum desired firing rate for a cell. If a cell's firing rate falls below this value, it will be boosted. This value is calculated as 1% of the maximum firing rate of its neighbors.
The following supporting routines are used in the above code.
kthScore(cols, k)
Given the list of columns, return the k'th highest overlap value.
updateActiveDutyCycle(c)
Computes a moving average of how often column c has been active after inhibition.
updateOverlapDutyCycle(c)
Computes a moving average of how often column c has overlap greater than minOverlap.
averageReceptiveFieldSize()
The radius of the average connected receptive field size of all the columns. The connected receptive field size of a column includes only the connected synapses (those with permanence values >= connectedPerm). This is used to determine the extent of lateral inhibition between columns.
maxDutyCycle(cols)
Returns the maximum active duty cycle of the columns in the given list of columns.
increasePermanences(c, s)
Increase the permanence value of every synapse in column c by a scale factor s.
boostFunction(c)
Returns the boost value of a column. The boost value is a scalar >= 1. If activeDutyCycle(c) is above minDutyCycle(c), the boost value is 1. The boost increases linearly once the column's activeDutyCycle starts falling below its minDutyCycle.
Chapter 4: Temporal Pooling Implementation and Pseudocode
This chapter contains the detailed pseudocode for a first implementation of the temporal pooler function. The input to this code is activeColumns(t), as computed by the spatial pooler. The code computes the active and predictive state for each cell at the current time step, t. The boolean OR of the active and predictive states for each cell forms the output of the temporal pooler for the next level.
The pseudocode is split into three distinct phases that occur in sequence:
Phase 1: compute the active state, activeState(t), for each cell
Phase 2: compute the predicted state, predictiveState(t), for each cell
Phase 3: update synapses
Phase 3 is only required for learning. However, unlike spatial pooling, Phases 1 and 2 contain some learning-specific operations when learning is turned on. Since temporal pooling is significantly more complicated than spatial pooling, we first list the inference-only version of the temporal pooler, followed by a version that combines inference and learning. A description of the implementation details, terminology, and supporting routines appears at the end of the chapter, after the pseudocode.
Temporal pooler pseudocode: inference alone
Phase 1
The first phase calculates the active state for each cell. For each winning column we determine which cells should become active. If the bottom-up input was predicted by any cell (i.e. its predictiveState was 1 due to a sequence segment in the previous time step), then those cells become active (lines 4-9). If the bottom-up input was unexpected (i.e. no cells had predictiveState output on), then each cell in the column becomes active (lines 11-13).
1. for c in activeColumns(t)
2.
3. buPredicted = false
4. for i = 0 to cellsPerColumn - 1
5. if predictiveState(c, i, t-1) == true then
6. s = getActiveSegment(c, i, t-1, activeState)
7. if s.sequenceSegment == true then
8. buPredicted = true
9. activeState(c, i, t) = 1
10.
11. if buPredicted == false then
12. for i = 0 to cellsPerColumn - 1
13. activeState(c, i, t) = 1
Phase 2
The second phase calculates the predictive state for each cell. A cell will turn on its predictiveState if any one of its segments becomes active, i.e. if enough of its horizontal connections are currently firing due to feed-forward input.
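The inference-only Phase 2 rule can be sketched as follows (illustrative; the nested-dict segment representation and the helper name `phase2_predict` are assumptions, and the strict `>` comparison follows the segmentActive routine defined at the end of this chapter):

```python
# Sketch of inference-only Phase 2: a cell enters the predictive state when
# any of its segments has enough *connected* synapses onto currently active
# cells (illustrative structures; not the reference pseudocode).
def phase2_predict(cells, active_cells, activation_threshold, connected_perm):
    """cells: {(c, i): [ {(c2, i2): permanence, ...}, ... ]} per-cell segments."""
    predictive = set()
    for ci, segments in cells.items():
        for seg in segments:
            n_active = sum(1 for src, perm in seg.items()
                           if perm >= connected_perm and src in active_cells)
            if n_active > activation_threshold:
                predictive.add(ci)
                break          # one active segment is enough
    return predictive
```

Note that only connected synapses (permanence >= connectedPerm) count toward activation; weakly matching synapses matter only for learning.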
Temporal pooler pseudocode: combined inference and learning
Phase 1
The first phase calculates the activeState for each cell that is in a winning column. For those columns, the code further selects one cell per column as the learning cell (learnState). The logic is as follows: if the bottom-up input was predicted by any cell (i.e. its predictiveState output was 1 due to a sequence segment), then those cells become active (lines 23-27). If that segment became active from cells chosen with learnState on, this cell is selected as the learning cell (lines 28-30). If the bottom-up input was not predicted, then all cells in the column become active (lines 32-34). In addition, the best matching cell is chosen as the learning cell and a new segment is added to that cell (lines 36-41).
18. for c in activeColumns(t)
19.
20. buPredicted = false
21. lcChosen = false
22. for i = 0 to cellsPerColumn - 1
23. if predictiveState(c, i, t-1) == true then
24. s = getActiveSegment(c, i, t-1, activeState)
25. if s.sequenceSegment == true then
26. buPredicted = true
27. activeState(c, i, t) = 1
28. if segmentActive(s, t-1, learnState) then
29. lcChosen = true
30. learnState(c, i, t) = 1
31.
32. if buPredicted == false then
33. for i = 0 to cellsPerColumn - 1
34. activeState(c, i, t) = 1
35.
36. if lcChosen == false then
37. i = getBestMatchingCell(c, t-1)
38. learnState(c, i, t) = 1
39. sUpdate = getSegmentActiveSynapses (c, i, -1, t-1, true)
40. sUpdate.sequenceSegment = true
41. segmentUpdateList.add(sUpdate)
Phase 2
The second phase calculates the predictive state for each cell. A cell will turn on its predictive state output if one of its segments becomes active, i.e. if enough of its horizontal connections are currently firing due to feed-forward input. In this case, the cell queues up the following changes: a) reinforcement of the currently active segment (lines 30-31), and b) reinforcement of a segment that could have predicted this activation, i.e. a segment that has a (potentially weak) match to activity during the previous time step (lines 33-36).
Phase 3
The third and last phase actually carries out learning. In this phase, segment updates that have been queued up are actually implemented once we get feed-forward input and the cell is chosen as a learning cell (lines 38-40). Otherwise, if the cell ever stops predicting for any reason, we negatively reinforce the segments (lines 41-43).
Implementation details and terminology
In this section we describe some of the details of our temporal pooler implementation and pseudocode. Each cell is indexed using two numbers: a column index, c, and a cell index, i. Cells maintain a list of dendrite segments, where each segment contains a list of synapses plus a permanence value for each synapse. Changes to a cell's synapses are marked as temporary until the cell becomes active from feed-forward input. These temporary changes are maintained in segmentUpdateList. Each segment also maintains a boolean flag, sequenceSegment, indicating whether the segment predicts feed-forward input on the next time step.
The implementation of potential synapses is different from the implementation in the spatial pooler. In the spatial pooler, the complete list of potential synapses is represented as an explicit list. In the temporal pooler, each segment can have its own (possibly large) list of potential synapses. In practice, maintaining a long list for each segment is computationally expensive and memory intensive. Therefore, in the temporal pooler we randomly add active synapses to each segment during learning (controlled by the parameter newSynapseCount). This optimization has a similar effect to maintaining the full list of potential synapses, but the list per segment is far smaller while still maintaining the possibility of learning new temporal patterns.
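The per-segment sampling described above can be sketched as follows (an illustrative reading of the optimization, not the reference implementation; the helper name and the initial permanence value are assumptions):

```python
import random

# Sketch of the newSynapseCount optimization: instead of keeping a full
# potential-synapse list, a learning segment randomly picks up to
# newSynapseCount connections from the cells with learnState on.
newSynapseCount = 10

def grow_segment_synapses(segment, learn_cells, initial_perm=0.21):
    """segment: {presynaptic_cell: permanence}; learn_cells: cells with learnState on."""
    candidates = [c for c in learn_cells if c not in segment]
    n_new = max(0, newSynapseCount - len(segment))
    for c in random.sample(candidates, min(n_new, len(candidates))):
        segment[c] = initial_perm
```

Sampling only from learnState cells is what prevents a fully bursting column from being over-represented in the segments that learn from it.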
The pseudocode also uses a small state machine to keep track of the cell states at different time steps. We maintain three different states for each cell. The arrays activeState and predictiveState keep track of the active and predictive states of each cell at each time step. The array learnState determines which cell outputs are used during learning. When an input is unexpected, all the cells in a particular column become active in the same time step. Only one of these cells (the cell that best matches the input) has its learnState turned on. We only add synapses from cells that have learnState set to one (this avoids overrepresenting a fully active column in dendritic segments).
The following data structures are used in the temporal pooler pseudocode:
cell(c,i)              A list of all cells, indexed by i and c.
cellsPerColumn         Number of cells in each column.
activeColumns(t)       List of column indices that are winners due to bottom-up input (this is the output of the spatial pooler).
activeState(c, i, t)   A boolean vector with one number per cell. It represents the active state of column c cell i at time t given the current feed-forward input and the past temporal context. activeState(c, i, t) is the contribution from column c cell i at time t. If 1, the cell has current feed-forward input as well as an appropriate temporal context.
predictiveState(c, i, t)   A boolean vector with one number per cell. It represents the prediction of column c cell i at time t, given the bottom-up activity of other columns and the past temporal context. predictiveState(c, i, t) is the contribution of column c cell i at time t. If 1, the cell is predicting feed-forward input in the current temporal context.
learnState(c, i, t)    A boolean indicating whether cell i in column c is chosen as the cell to learn on.
activationThreshold    Activation threshold for a segment. If the number of active connected synapses in a segment is greater than activationThreshold, the segment is said to be active.
learningRadius         The area around a temporal pooler cell from which it can get lateral connections.
initialPerm            Initial permanence value for a synapse.
connectedPerm          If the permanence value for a synapse is greater than this value, it is said to be connected.
minThreshold           Minimum segment activity for learning.
newSynapseCount        The maximum number of synapses added to a segment during learning.
permanenceInc          Amount permanence values of synapses are incremented when activity-based learning occurs.
permanenceDec          Amount permanence values of synapses are decremented when activity-based learning occurs.
segmentUpdate          Data structure holding three pieces of information required to update a given segment: a) segment index (-1 if it's a new segment), b) a list of existing active synapses, and c) a flag indicating whether this segment should be marked as a sequence segment (defaults to false).
segmentUpdateList      A list of segmentUpdate structures. segmentUpdateList(c,i) is the list of changes for cell i in column c.
The following supporting routines are used in the above code:
segmentActive(s, t, state)
This routine returns true if the number of connected synapses on segment s that are active due to the given state at time t is greater than activationThreshold. The parameter state can be activeState or learnState.
getActiveSegment(c, i, t, state)
For the given column c cell i, return a segment index such that segmentActive(s, t, state) is true. If multiple segments are active, sequence segments are given preference. Otherwise, segments with the most activity are given preference.
getBestMatchingSegment(c, i, t)
For the given column c cell i at time t, find the segment with the largest number of active synapses. This routine is aggressive in finding the best match: the permanence value of synapses is allowed to be below connectedPerm, and the number of active synapses is allowed to be below activationThreshold, but it must be above minThreshold. The routine returns the segment index. If no segments are found, then an index of -1 is returned.
getBestMatchingCell(c)
For the given column, return the cell with the best matching segment (as defined above). If no cell has a matching segment, then return the cell with the fewest number of segments.
Glossary
Note: Definitions here capture how terms are used in this document, and may have other meanings in general use. Capitalized terms refer to other defined terms in this glossary.
Active State           a state in which Cells are active due to Feed-Forward input
Bottom-Up              synonym for Feed-Forward
Cell                   HTM equivalent of a Neuron. Cells are organized into columns in HTM regions.
Coincident Activity    two or more Cells are active at the same time
Column                 a group of one or more Cells that function as a unit in an HTM Region. Cells within a column represent the same feed-forward input, but in different contexts.
Dendrite Segment       a unit of integration of Synapses associated with Cells and Columns. Dendrite segments of cells are associated with lateral connections; when the number of active synapses on the dendrite segment exceeds a threshold, the associated cell enters the predictive state. Dendrite segments of columns are associated with feed-forward connections; the number of active synapses is summed to generate the feed-forward activation of a column.
Desired Density        desired percentage of Columns active due to Feed-Forward input to a Region. The percentage only applies within a radius that varies with the fan-out of the feed-forward input. It is "desired" because the percentage varies some based on the particular input.
Feed-Forward           moving in a direction away from an input, or from a lower Level to a higher Level in a Hierarchy (sometimes called Bottom-Up)
Feedback               moving in a direction towards an input, or from a higher Level to a lower Level in a Hierarchy (sometimes called Top-Down)
First Order Prediction   a prediction based only on the current input and not on the prior inputs; compare to Variable Order Prediction
Hierarchical Temporal Memory (HTM)   a technology that replicates some of the structural and algorithmic functions of the neocortex
Hierarchy              a network of connected elements where the connections between the elements are uniquely identified as Feed-Forward or Feedback
HTM Cortical Learning Algorithms   the suite of functions for Spatial Pooling, Temporal Pooling, and learning and forgetting that comprise an HTM Region; also referred to as HTM Learning Algorithms
HTM Network            a Hierarchy of HTM Regions
HTM Region             the main unit of memory and Prediction in an HTM. An HTM region is comprised of a layer of highly interconnected cells arranged in columns. An HTM region today has a single layer of cells, whereas in the neocortex (and ultimately in HTM), a region will have multiple layers of cells. When referred to in the context of a hierarchy, a region may be referred to as a level.
Inference              recognizing a spatial and temporal input pattern as similar to previously learned patterns
Inhibition Radius      defines the area around a Column that it actively inhibits
Lateral Connections    connections between Cells within the same Region
Level                  an HTM Region in the context of the Hierarchy
Neuron                 an information processing Cell in the brain. In this document, we use the word neuron specifically when referring to biological cells, and "cell" when referring to the HTM unit of computation.
Permanence             a scalar value which indicates the connection state of a Potential Synapse. A permanence value below a threshold indicates the synapse is not formed. A permanence value above the threshold indicates the synapse is valid. Learning in an HTM region is accomplished by modifying permanence values of potential synapses.
Potential Synapse      the subset of all Cells that could potentially form Synapses with a particular Dendrite Segment. Only a subset of potential synapses will be valid synapses at any time, based on their permanence value.
Prediction             activating Cells (into a predictive state) that will likely become active in the near future due to Feed-Forward input. An HTM region often predicts many possible future inputs at the same time.
Receptive Field        the set of inputs to which a Column or Cell is connected. If the input to an HTM region is organized as a 2D array of bits, then the receptive field can be expressed as a radius within the input space.
Sensor                 a source of inputs for an HTM Network
Sparse Distributed Representation   representation comprised of many bits in which a small percentage are active and where no single bit is sufficient to convey meaning
Spatial Pooling        the process of forming a sparse distributed representation of an input. One of the properties of spatial pooling is that overlapping input patterns map to the same sparse distributed representation.
Sub-Sampling           recognizing a large distributed pattern by matching only a small subset of the active bits in the large pattern
Synapse                connection between Cells formed while learning
Temporal Pooling       the process of forming a representation of a sequence of input patterns where the resulting representation is more stable than the input
Top-Down               synonym for Feedback
Variable Order Prediction   a prediction based on varying amounts of prior context; compare to First Order Prediction. It is called "variable" because the memory to maintain prior context is allocated as needed. Thus a memory system that implements variable order prediction can use context going way back in time without requiring exponential amounts of memory.
Appendix A: A Comparison between Biological Neurons and HTM Cells
The image above shows a picture of a biological neuron on the left, a simple artificial neuron in the middle, and an HTM neuron or "cell" on the right. The purpose of this appendix is to provide a better understanding of HTM cells and how they work by comparing them to real neurons and simpler artificial neurons.
Real neurons are tremendously complicated and varied. We will focus on the most general principles, and only those that apply to our model. Although the HTM cells ignore many details of real neurons, the cells used in the HTM cortical learning algorithms are far more realistic than the artificial neurons used in most neural networks. All the elements included in HTM cells are necessary for the operation of an HTM region.
Biological neurons
Neurons are the information-carrying cells in the brain. The neuron in the image above is a pyramidal cell, the most common type of neuron in the neocortex. For many years, it was assumed that simple artificial neurons captured the essence of how real neurons process information. Over recent decades, however, we have learned that the dendrites of real neurons are not simple conduits; they are complex non-linear processing elements in their own right. The HTM cortical learning algorithms take advantage of these non-linear properties.
Neurons have several parts.
Cell body
The cell body is the small volume in the center of the neuron. The output of the cell, the axon, originates at the cell body. The inputs to the cell are the synapses arrayed along the dendrites which feed to the cell body.
Proximal Dendrites
The dendrite branches closest to the cell body are called proximal dendrites. In the diagram some of the proximal dendrites are marked with green lines.
Multiple active synapses on proximal dendrites have a roughly linear additive effect at the cell body. Five active synapses will lead to roughly five times the depolarization at the cell body as one active synapse. In contrast, if a single synapse is activated repeatedly by a quick succession of action potentials, the second, third, and subsequent action potentials have much less effect at the cell body than the first.
Therefore, we can say that inputs to the proximal dendrites sum linearly at the cell body, and that rapid spikes arriving at a single synapse will have only a slightly larger effect than a single spike.
The feed-forward connections to a region of neocortex preferentially connect to the proximal dendrites. This has been reported at least for layer 4 neurons, the primary input layer of neurons in each region.
Distal Dendrites
The dendrite branches farther from the cell body are called distal dendrites. In the diagram some of the distal dendrites are marked with blue lines.
Distal dendrites are thinner than proximal dendrites. They connect to other dendrites at branches in the dendritic tree and do not connect directly to the cell body. These differences (and others) tell us that distal dendrites play a different role than proximal dendrites. When a distal synapse is activated in isolation, it has a minimal effect at the cell body; the depolarization it causes is attenuated by the time it reaches the cell body. For many years this was viewed as a mystery. It seemed the distal synapses, which are the majority of synapses on a neuron, couldn't do much.
We now know that sections of distal dendrites act as semi-independent processing regions. If enough synapses become active at the same time within a short distance along the dendrite, they can generate a dendritic spike that can travel to the cell body with a large effect. For example, twenty active synapses within 40 µm of each other will generate a dendritic spike.
Therefore, we can say that the distal dendrites act like a set of threshold coincidence detectors.
The synapses formed on distal dendrites are predominantly from other cells nearby in the region.
The image shows a large dendrite branch extending upwards which is called the apical dendrite. The apical dendrite branches into several distal dendrites in an area where they can more easily make connections to axons from other cells. In this way, the apical dendrite acts as an extension of the cell.
Synapses
A typical neuron has several thousand synapses. The large majority (perhaps 90%) of these will be on distal dendrites, and the rest will be on proximal dendrites.
For many years it was assumed that learning involved strengthening and weakening the effect or "weight" of synapses. Although this effect has been observed, each synapse is somewhat stochastic: when activated, it will not reliably release a neurotransmitter. Therefore the algorithms used by the brain cannot depend on precision or fidelity of individual synapse weights.
Further, we now know that entire synapses form and un-form rapidly. This flexibility represents a powerful form of learning and better explains the rapid acquisition of knowledge. A synapse can only form if an axon and a dendrite are within a certain distance, leading to the concept of "potential" synapses. With these assumptions, learning occurs largely by forming valid synapses from potential synapses.
Neuron Output
The output of a neuron is a spike, or "action potential", which propagates along the axon. The axon leaves the cell body and almost always splits in two: one branch travels horizontally, making many connections with other cells nearby, and the other branch projects to other layers of cells or elsewhere in the brain. In the image of the neuron above, the axon was not visible; we added a small line to represent that axon.
Although the actual output of a neuron is always a spike, there are different views on how to interpret this. The predominant view (especially in regards to the neocortex) is that the rate of spiking is what matters. Therefore the output of a cell can be viewed as a scalar value.
Some neurons also exhibit a "bursting" behavior, a short and fast series of a few spikes that are different than the regular spiking pattern.
The above description of a neuron is intended to give a brief introduction to neurons. It focuses on the attributes that correspond to features of HTM cells and leaves out many details. Not all the features just described are universally accepted; we include them because they are necessary for our models. What is known about neurons could easily fill several books, and active research on neurons continues today.
Simple artificial neurons
The middle image at the beginning of this Appendix shows a neuron-like element used in many classic artificial neural network models. These artificial neurons have a set of synapses, each with a weight. Each synapse receives a scalar activation, which is multiplied by the synapse weight. The output of all the synapses is summed in a non-linear fashion to produce an output of the artificial neuron. Learning occurs by adjusting the weights of the synapses and perhaps the non-linear function.
This type of artificial neuron, and variations of it, has proven useful in many applications as a valuable computational tool. However, it doesn't capture much of the complexity and processing power of biological neurons. If we want to understand and model how an ensemble of real neurons works in the brain, we need a more sophisticated neuron model.
HTM cells
In our illustration, the image on the right depicts a cell used in the HTM cortical learning algorithms. An HTM cell captures many of the important capabilities of real neurons but also makes several simplifications.
Proximal Dendrite
Each HTM cell has a single proximal dendrite. All feed-forward input to the cell is made via synapses (shown as green dots). The activity of synapses is linearly summed to produce a feed-forward activation for the cell.
We require that all the cells in a column respond to the same feed-forward input. Rather than give each cell in a column its own proximal dendrite with identical synapses, we force all the cells in a column to share a single proximal dendrite.
To avoid having cells that never win in the competition with neighboring cells, an HTM cell will boost its feed-forward activation if it is not winning enough relative to its neighbors. Thus there is a constant competition between cells. Again, in an HTM we model this as a competition between columns, not cells. This competition is not illustrated in the diagram.
Finally, the proximal dendrite has an associated set of potential synapses which is a subset of all the inputs to a region. As the cell learns, it increases or decreases the permanence value of all the potential synapses on the proximal dendrite. Only those potential synapses that are above a threshold are valid.
As mentioned earlier, the concept of potential synapses comes from biology, where it refers to axons and dendrites that are close enough to form a synapse. We extend this concept to a larger set of potential connections for an HTM cell. Dendrites and axons on biological neurons can grow and retract as learning occurs, and therefore the set of potential synapses changes with growth. By making the pool of potential synapses on an HTM cell large, we roughly achieve the same result as axon and dendrite growth.
The combination of competition between columns, learning from a set of potential synapses, and boosting underutilized columns gives a region of HTM neurons a powerful plasticity also seen in brains. An HTM region will automatically adjust what each column represents (via changes to the synapses on the proximal dendrites) if the input changes, or the number of columns increases or decreases.
Distal Dendrites
Each HTM cell maintains a list of distal dendrite segments. Each segment acts like a threshold detector: if the number of active synapses on the segment (shown as blue dots on the earlier diagram) is above a threshold, the segment becomes active, and the associated cell enters the predictive state. The predictive state of a cell is the logical OR of the activations of its segments.
A dendrite segment remembers the state of the region by forming connections to cells that were active together at a point in time; the segment remembers a state that precedes the cell becoming active due to feed-forward input. A typical threshold for a dendrite segment is 15. If 15 valid synapses on a segment are active at once, the dendrite becomes active. There might be hundreds or thousands of cells active nearby, but connecting to only 15 is sufficient to recognize the larger pattern.
Each distal dendrite segment also has an associated set of potential synapses. The set of potential synapses is a subset of all the cells in a region. As the segment learns, it increases or decreases the permanence value of all its potential synapses. Only those potential synapses that are above a threshold are valid.
In one implementation, we use a fixed number of dendrite segments per cell. In another implementation, we add segments while training. Both methods can work. If we have a fixed number of dendrite segments per cell, it is possible to store several different sets of synapses on the same segment. For example, say we have 20 valid synapses on a segment and a threshold of 15. (In general we want the threshold to be less than the number of synapses to improve noise immunity.) The segment can now recognize one particular state of cells nearby. What would happen if we added another 20 synapses to the same segment representing an entirely different state of cells nearby? It introduces the possibility of error because the segment could add 8 active synapses from one pattern and 7 active synapses from the other and become active incorrectly. We have found experimentally that up to 20 different patterns can be stored on one segment before errors occur. Therefore an HTM cell with a dozen dendrite segments can participate in many different predictions.
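The mixing error described above can be illustrated deterministically. The sketch below (illustrative only; the cell indices are arbitrary stand-ins) stores two 20-synapse patterns on one segment with a threshold of 15 and constructs the pathological input of 8 cells from one pattern plus 7 from the other:

```python
# Deterministic illustration of the mixing error: a segment holding 20
# synapses from each of two patterns (threshold 15) can be activated by a
# pathological input that combines parts of both patterns.
pattern_a = set(range(0, 20))        # stand-ins for 20 cells of pattern A
pattern_b = set(range(100, 120))     # stand-ins for 20 cells of pattern B
segment = pattern_a | pattern_b      # 40 synapses on one segment
THRESHOLD = 15

# 8 cells from A plus 7 from B reaches the threshold even though neither
# stored pattern is fully present.
mixed_input = set(range(0, 8)) | set(range(100, 107))
```

With sparse distributed representations, such a mixed input is vanishingly unlikely to occur by chance, which is why this failure mode is tolerable in practice.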
Synapses
Synapses on an HTM cell have a binary weight. There is nothing in the HTM model that precludes scalar synapse weights, but due to the use of sparse distributed patterns we have not yet had a need to use scalar weights.
However, synapses on an HTM cell have a scalar value called "permanence" which is adjusted during learning. A 0.0 permanence value represents a potential synapse which is not valid and has not progressed at all towards becoming a valid synapse. A permanence value above a threshold (typically 0.2) represents a synapse that has connected but could easily be un-connected. A high permanence value, for example 0.9, represents a synapse that is connected and cannot easily be un-connected.
The number of valid synapses on the proximal and distal dendrite segments of an HTM cell is not fixed; it changes as the cell is exposed to patterns. For example, the number of valid synapses on the distal dendrites is dependent on the temporal statistics of the data. If there is little persistent temporal structure in the input to the region, then all the synapses on distal segments would have low permanence values and very few synapses would be valid. If there is a lot of temporal structure in the input stream, then we will find many valid synapses with high permanence.
Cell Output
An HTM cell has two different binary outputs: 1) the cell is active due to feed-forward input (via the proximal dendrite), and 2) the cell is active due to lateral connections (via the distal dendrite segments). The former is called the "active state" and the latter is called the "predictive state".
In the earlier diagram, the two outputs are represented by the two lines exiting the square cell body. The left line is the feed-forward active state, while the right line is the predictive state.
Only the feed-forward active state is connected to other cells in the region, ensuring that predictions are always based on the current input (plus context). We don't want to make predictions based on predictions; if we did, almost all the cells in the region would be in the predictive state after a few iterations.
The output of the region is a vector representing the state of all the cells. This vector becomes the input to the next region of the hierarchy, if there is one. This output is the OR of the active and predictive states. By combining both active and predictive states, the output of our region will be more stable (slower changing) than the input. Such stability is an important property of a region.
Suggested reading
We are often asked to suggest reading materials to learn more about neuroscience. The field of neuroscience is so large that a general introduction requires looking at many different sources, and much of the relevant material appears in academic journals that are both hard to read and hard to get access to if you don't have a university affiliation.
Here are two readily available books that a dedicated reader might want to look at which are relevant to the topics in this appendix.
Stuart, Greg; Spruston, Nelson; Häusser, Michael. Dendrites, second edition (New York: Oxford University Press, 2008)
This book is a good source on everything about dendrites. Chapter 16 discusses the non-linear properties of dendrite segments used in the HTM cortical learning algorithms; it is written by Bartlett Mel, who has thought a lot about this field.
Mountcastle, Vernon B. Perceptual Neuroscience: The Cerebral Cortex (Cambridge, Mass.: Harvard University Press, 1998)
This book is a good overview of everything about the neocortex. Several of the chapters discuss cell types and their connections, so you can get a good sense of cortical neurons and their connections, although it is too old to cover the latest knowledge of dendrite properties.