Sparse Distributed Representations:
Our Brain’s Data Structure
Numenta Workshop
October 17, 2014
Subutai Ahmad, VP Research
sahmad@numenta.com
The Role of Sparse Distributed Representations in Cortex
1) Sensory perception
2) Planning
3) Motor control
4) Prediction
5) Attention
Sparse Distributed Representations (SDRs) are the foundation for all these
functions, across all sensory modalities
Analysis of this common cortical data structure can provide a rigorous
foundation for cortical computing
Talk Outline
1) Introduction to Sparse Distributed Representations (SDRs)
2) Fundamental properties of SDRs
– Error bounds
– Scaling laws
(Image slide) Credit: Prof. Hasan, Max-Planck-Institut for Research
Basic Attributes of SDRs
1) Only a small number of neurons are firing at any point in time
2) There are a very large number of neurons
3) Every cell represents something and has meaning
4) Information is distributed and no single neuron is critical
5) Every neuron only connects to a subset of other neurons
6) SDRs enable extremely fast computation
7) SDRs are binary
x = 0100000000000000000100000000000110000000
How Does a Single Neuron Operate on SDRs?
(Figure: multiple input SDRs feed into a single neuron, which contributes a single bit to an output SDR)
How Does a Single Neuron Operate on SDRs?
• Hundreds of distal segments each detect a unique SDR using a threshold
• Proximal segments represent dozens of separate patterns in a single segment
• In both cases each synapse corresponds to one bit in the incoming high-dimensional SDR
(Figure labels: feedback SDR, context SDR, bottom-up input SDR)
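As a rough illustration of the two slides above (my own sketch in Python, not Numenta code; all names are illustrative), a neuron can be modeled as a collection of segments, where each segment stores a small set of synapses onto bits of an incoming SDR and detects that SDR when enough of those bits are active:

```python
class Segment:
    """A dendritic segment: a small set of synapses onto bits of an incoming SDR."""
    def __init__(self, synapses, threshold):
        self.synapses = set(synapses)   # indices of the SDR bits this segment samples
        self.threshold = threshold

    def detects(self, active_bits):
        """Fires if enough of its synapses fall on active bits of the SDR."""
        return len(self.synapses & set(active_bits)) >= self.threshold

# A neuron with many distal segments, each tuned to a different context SDR.
segments = [Segment({3, 57, 1021, 1900, 2040}, threshold=4)]
context_sdr = {3, 57, 210, 1021, 1900, 2040}
print(any(seg.detects(context_sdr) for seg in segments))   # True
```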
Fundamental Properties of SDRs
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent a dynamic set of patterns in a single fixed structure
• Extremely efficient
Notation
• We represent an SDR as a vector of n binary values, where each bit represents the activity of a single neuron:
$$x = [b_0, \ldots, b_{n-1}]$$
• s = percentage of ON bits, w = number of ON bits:
$$w_x = s \cdot n = \|x\|_1$$
Example
• n = 40, s = 0.1, w = 4
• Typical range of numbers in HTM implementations:
n = 2048 to 65,536; s = 0.05% to 2%; w = 40
y =1000000000000000000100000000000110000000
x = 0100000000000000000100000000000110000000
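A minimal sketch of this notation in Python (an illustrative addition, assuming NumPy; random_sdr is my own helper name):

```python
import numpy as np

n, w = 40, 4                      # vector size and number of ON bits (s = w/n = 0.1)

def random_sdr(n, w, rng=np.random.default_rng()):
    """Return a binary vector of length n with exactly w randomly placed ON bits."""
    x = np.zeros(n, dtype=np.uint8)
    x[rng.choice(n, size=w, replace=False)] = 1
    return x

x = random_sdr(n, w)
print(x.sum())                    # w_x = s * n = ||x||_1 = 4
```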
SDRs Have Extremely High Capacity
• The number of unique patterns that can be represented is:
$$\binom{n}{w} = \frac{n!}{w!\,(n-w)!}$$
• This is far smaller than 2^n, but far larger than any reasonable need
• Example: with n = 2048 and w = 40, the number of unique patterns is > 10^84 >> the number of atoms in the universe
• The chance that two random vectors are identical is essentially zero:
$$1 \Big/ \binom{n}{w}$$
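As a quick sanity check of the capacity claim (an illustrative addition, using only Python's standard library):

```python
from math import comb

n, w = 2048, 40
capacity = comb(n, w)       # number of distinct SDRs with exactly w of n bits ON
print(len(str(capacity)))   # 85 decimal digits, i.e. > 10^84
print(1 / capacity)         # chance two random SDRs are identical: effectively zero
```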
Fundamental Properties of SDRs
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent multiple patterns in a single fixed structure
• Extremely efficient
Similarity Metric for Recognition of SDR Patterns
• We don’t use typical vector similarities
– Neurons cannot compute Euclidean or Hamming distance between SDRs
– Any p-norm requires full connectivity
• Compute similarity using an overlap metric
– The overlap is simply the number of bits in common
– Requires only minimal connectivity
– Mathematically, take the AND of two vectors and compute its length
• Detecting a “Match”
– Two SDR vectors “match” if their overlap meets a minimum threshold
$$\text{overlap}(x,y) \equiv \|x \wedge y\|_1$$
$$\text{match}(x,y) \equiv \text{overlap}(x,y) \geq \theta$$
Overlap example
• n = 40, s = 0.1, w = 4
• The two vectors have an overlap of 3, so they “match” if the threshold θ is 3 or lower.
y =1000000000000000000100000000000110000000
x = 0100000000000000000100000000000110000000
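A small Python sketch of the overlap and match operations, applied to the two example vectors above (an illustrative addition; the function names simply mirror the definitions on the previous slide):

```python
def overlap(x, y):
    """Number of ON bits two binary vectors share, i.e. the length of x AND y."""
    return sum(a & b for a, b in zip(x, y))

def match(x, y, theta):
    """True if the overlap meets the minimum threshold theta."""
    return overlap(x, y) >= theta

x = [int(c) for c in "0100000000000000000100000000000110000000"]
y = [int(c) for c in "1000000000000000000100000000000110000000"]
print(overlap(x, y))          # 3
print(match(x, y, theta=3))   # True
```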
How Accurate is Matching With Noise?
• As you decrease the match threshold θ, you decrease sensitivity and increase robustness to noise
• You also increase the chance of false positives
How Many Vectors Match When You Decrease the Threshold?
• Define the “overlap set of x” to be the set of
vectors with exactly b bits of overlap with x
• The number of such vectors is:
$$|\Omega_x(n,w,b)| = \binom{w_x}{b} \times \binom{n - w_x}{w - b}$$
– The first term counts the subsets of x with exactly b bits ON; the second counts the patterns occupying the rest of the vector with exactly w − b bits ON
Error Bound for Classification with Noise
• Given a single stored pattern, the probability of a false positive is:
$$fp_w^n(\theta) = \frac{\sum_{b=\theta}^{w} |\Omega_x(n,w,b)|}{\binom{n}{w}}$$
• Given M stored patterns X = {x_0, …, x_{M-1}}, the probability of a false positive is bounded by:
$$fp_X(\theta) \leq \sum_{i=0}^{M-1} fp_{w_{x_i}}^{n}(\theta)$$
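The bound above is straightforward to evaluate numerically. The sketch below (an illustrative addition; the function name and parameters are my own) computes the single-pattern false positive probability and, via the union bound, roughly reproduces the n = 64 example on the next slide:

```python
from math import comb

def false_positive_rate(n, w, theta):
    """P(a random SDR with w of n bits ON overlaps a stored SDR in >= theta bits)."""
    matching = sum(comb(w, b) * comb(n - w, w - b) for b in range(theta, w + 1))
    return matching / comb(n, w)

# n = 64, w = 12, up to 4 bits of noise => threshold theta = 12 - 4 = 8.
# With M = 10 stored patterns the bound gives roughly 0.04%:
print(10 * false_positive_rate(64, 12, 8))   # ~4e-4
```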
What Does This Mean in Practice?
• With SDRs you can classify a huge number of patterns with substantial noise
(if n and w are large enough)
Examples
• n = 2048, w = 40
With up to 14 bits of noise (33%), you can classify a quadrillion patterns with an error rate of less than 10^-24
With up to 20 bits of noise (50%), you can classify a quadrillion patterns with an error rate of less than 10^-11
• n = 64, w=12
With up to 4 bits of noise (33%), you can classify 10 patterns
with an error rate of 0.04%
Neurons Are Highly Robust Pattern Recognizers
Hundreds of distal segments each detect a
unique SDR using a threshold
You can have tens of thousands of neurons examining a single input SDR, and very
robustly matching complex patterns
Fundamental Properties of SDRs
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent multiple patterns in a single fixed structure
• Extremely efficient
SDRs are Robust to Random Deletions
• In cortex, bits in an SDR can randomly disappear
– Synapses can be quite unreliable
– Individual neurons can die
– A patch of cortex can be damaged
• The analysis for random deletions is very similar to noise
• SDRs can naturally handle fairly significant random failures
– Failures are tolerated in any SDR and in any part of the system
• This is a great property for those building HTM based hardware
– The probability of failures can be exactly characterized
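A tiny simulation of random deletions (an illustrative addition, assuming ON bits are dropped uniformly at random): as long as the surviving ON bits still meet the threshold θ, the damaged SDR matches the original:

```python
import random

n, w, theta = 2048, 40, 15
on_bits = set(random.sample(range(n), w))          # original SDR as a set of ON indices

deleted = set(random.sample(sorted(on_bits), 20))  # randomly drop 20 of the 40 ON bits
damaged = on_bits - deleted

surviving_overlap = len(on_bits & damaged)
print(surviving_overlap, surviving_overlap >= theta)   # 20, True -- still a match
```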
Fundamental Properties of SDRs
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent multiple patterns in a single fixed structure
• Extremely efficient
Representing Multiple Patterns in a Single SDR
• There are situations where we want to store multiple patterns within a single SDR
and match them
• In temporal inference the system might make multiple predictions about the future
Example
Unions of SDRs
• We can store a set of patterns in a single fixed representation by taking the OR of
all the individual patterns
• The vector representing the union is also going to match a large number of other
patterns that were not one of the original 10
• How many such patterns can we store reliably, without a high chance of false
positives?
(Figure: ten stored SDRs, each roughly 2% sparse, are ORed into a single union SDR that is < 20% sparse; a new SDR is then tested with “Is this SDR a member?”)
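A minimal sketch of the union property (an illustrative addition; the union is just the OR of the stored SDRs, and membership testing is the same overlap/threshold match as before):

```python
import random

n, w, theta = 2048, 40, 40      # here membership requires all of a pattern's ON bits
patterns = [frozenset(random.sample(range(n), w)) for _ in range(10)]

union = set().union(*patterns)  # OR of all stored patterns
print(len(union) / n)           # union sparsity, roughly 0.18 for these parameters

def is_member(sdr, union, theta):
    """True if at least theta of the SDR's ON bits are contained in the union."""
    return len(sdr & union) >= theta

print(all(is_member(p, union, theta) for p in patterns))   # True for every stored pattern
```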
Error Bounds for Unions
• Expected number of ON bits:
• Given a union of M patterns, the expected probability of a false positive (with noise) is:
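The two equations on this slide did not survive text extraction. As a hedged sketch of the standard union analysis, assuming M independent random patterns each with sparsity s = w/n, the expected number of ON bits in the union is
$$\tilde{w} = n\left(1 - (1 - s)^M\right)$$
and, for an exact match (θ = w), the expected false positive probability for a new random pattern is approximately
$$p_{fp} \approx \left(1 - (1 - s)^M\right)^{w}$$
(with noise, the threshold sum from the earlier slides is applied with the union's ON bits in place of w_x). Plugging in n = 512, w = 10, M = 50 gives roughly 0.9%, consistent with the example on the next slide.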
What Does This Mean in Practice?
• You can form reliable unions of a reasonable number of patterns (assuming
large enough n and w)
Examples
• n = 2048, w = 40
The union of 50 patterns leads to an error rate of 10^-9
• n = 512, w=10
The union of 50 patterns leads to an error rate of 0.9%
Fundamental Properties of SDRs
• Extremely high capacity
• Recognize patterns in the presence of noise
• Robust to random deletions
• Represent multiple patterns in a single fixed structure
• Extremely efficient
SDRs Enable Highly Efficient Operations
• In cortex, complex operations are carried out rapidly
– The visual system can perform object recognition in 100-150 ms
• SDR vectors are large, but all operations are O(w) and independent of
vector size
– No loops or optimization process required
• Matching a pattern against a dynamic list (unions) is O(w) and
independent of the number of items in the list
• Enables a tiny dendritic segment to perform robust pattern recognition
• We can simulate 200,000 neurons in software at about 25-50Hz
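A sketch of why the operations are O(w) (an illustrative addition, assuming SDRs are stored as sets of ON-bit indices rather than dense vectors): overlap, match, and union touch only the w ON bits, so their cost is independent of n:

```python
# Store each SDR as a set of ON-bit indices; n can be as large as you like.
a = {7, 120, 5003, 64000}      # w = 4 ON bits out of n = 65,536
b = {7, 120, 5003, 12345}

overlap = len(a & b)           # O(w) set intersection, independent of n
matched = overlap >= 3         # threshold test
union = a | b                  # O(w) per pattern added to a union
print(overlap, matched, sorted(union))
```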
Summary
• SDRs are the common data structure in the cortex
• SDRs enable flexible recognition systems that have very high capacity and are robust to a large amount of noise
• The union property allows a fixed representation to encode a dynamically changing set of patterns
• The analysis of SDRs provides a principled foundation for characterizing the behavior of the HTM learning algorithms and all cognitive functions
• Related work: sparse memory (Kanerva), sparse coding (Olshausen), Bloom filters (Broder)
Questions? Math jokes?
Follow us on Twitter @numenta
Sign up for our newsletter at www.numenta.com
Subutai Ahmad
sahmad@numenta.com
nupic-theory mailing list
numenta.org/lists