WHERE FUTURE BEGINS
  • SELF DEEP LEARNING
  • LSE MBA Essentials - The London School of Economics
    • Leading with influence
    • Economics for managers
    • Competitive strategy
    • Corporate strategy
    • Financial accounting
    • Management accounting
    • Analysing financial statements
    • In the mind of the manager
    • Nudging behaviour
    • Organisational culture as a leadership tool
  • Business Foundations Specialization - Wharton Online
    • Introduction to Marketing
      • BRANDING: Marketing Strategy and Brand Positioning
      • Marketing 101: Building Strong Brands Part I
      • Marketing 101: Building Strong Brands Part II
      • Strategic Marketing
      • Segmentation and Targeting
      • Brand Positioning
      • Brand Mantra: The Elevator Speech
      • Experiential Branding
      • CUSTOMER CENTRICITY: The Limits of Product-Centric Thinking
      • Cracks in the Product-Centric Approach
      • Data-Driven Business Models
      • Three Cheers for Direct Marketing
      • Which Firms Are Customer Centric?
      • What is Customer Centricity?
      • Living in a Customer-Centric World
      • More Reflections on Customer Centricity
      • Questions on Customer Centricity
      • GO TO MARKET STRATEGIES: Online-Offline Interaction
      • Online/Offline Competition
      • Friction
      • The Long Tail Theory
      • Preference Isolation
      • How Internet Retailing Startups Grow
      • Customers and Digital Marketing
      • Influence and How Information Spreads
      • Pricing Strategies
      • The 7Ms
      • BRANDING: Effective Brand Communications Strategies and Repositioning Strategies
      • Brand Messaging & Communication
      • Brand Elements: Choosing a Brand Name
      • Brand Elements: Color & Taglines
      • Brand Elements: Packaging
      • Brand Elements: Persuasion
      • Repositioning a Brand
    • Introduction to Financial Accounting
      • 1.1.1: Financial Reporting Overview
      • 1.1.2: Financial Reporting Example
    • Managing Social and Human Capital
      • Professor Cappelli and Professor Useem Introductions
    • Introduction to Corporate Finance
      • Time Value of Money
      • Intuition and Discounting
      • Compounding
      • Useful Shortcuts
      • Taxes
      • Inflation
      • APR and EAR
      • Term Structure
      • Discounted Cash Flow: Decision Making
      • Discounted Cash Flow Analysis
      • Forecast Drivers
      • Forecasting Free Cash Flow
      • Decision Criteria
      • Sensitivity Analysis
      • Return on Investment
    • Introduction to Operations Management
    • Wharton Business Foundations Capstone
  • Artificial Intelligence Career Program - deeplearning.ai
    • Machine Learning
      • Introduction to Machine Learning
      • Supervised Learning
      • Unsupervised Learning
      • Model Representation - Linear Regression
      • Cost Function
      • Gradient Descent
      • Gradient Descent For Linear Regression
      • Linear Algebra
    • Deep Learning
    • Neural Networks and Deep Learning
      • Introduction to Deep Learning
      • What is a neural network?
      • Supervised Learning with Neural Networks
      • Why is Deep Learning taking off?
      • About this Course
      • Binary Classification
      • Logistic Regression
      • Gradient Descent
      • Derivatives
      • Computation graph
      • Derivatives with a Computation Graph
      • Logistic Regression Gradient Descent
      • Vectorization
      • Vectorizing Logistic Regression
      • Vectorizing Logistic Regression's Gradient Output
      • Broadcasting in Python
      • A note on python/numpy vectors
      • Explanation of logistic regression cost function (optional)
      • Neural Networks Overview
      • Neural Network Representation
      • Computing a Neural Network's Output
      • Vectorizing across multiple examples
      • Activation functions
      • Derivatives of activation functions
      • Gradient descent for Neural Networks
      • Backpropagation intuition (optional)
      • Random Initialization
      • Deep L-layer neural network
      • Forward Propagation in a Deep Network
      • Getting your matrix dimensions right
      • Why deep representations?
      • Building blocks of deep neural networks
      • Forward and Backward Propagation
      • Parameters vs Hyperparameters
      • What does this have to do with the brain?
    • Convolutional Neural Networks
      • Computer Vision
      • Edge Detection Example
      • Padding
      • Strided Convolutions
      • Convolutions Over Volume
      • One Layer of a Convolutional Network
      • Simple Convolutional Network Example
      • Pooling Layers
      • CNN Example - Fully Connected Layers
      • Why Convolutions?
    • Neural Network Theory [ETH]
    • Natural Language Processing
    • Computer Vision
  • IBM Data Science Professional Certificate
    • What is Data Science?
    • Open Source tools for Data Science
    • Data Science Methodology
    • Python for Data Science and AI
    • Databases and SQL for Data Science
    • Data Analysis with Python
    • Data Visualization with Python
    • Machine Learning with Python
    • Applied Data Science Capstone
  • Data Analytics
    • Python for Data Analysis
    • Data Structure and Algorithms
  • Programming Language
    • Python
    • R
    • SQL
    • C++
    • C
    • Java
    • HTML
  • Machine Learning Engineer
  • Business Data Analytics

What does this have to do with the brain?

Last updated 5 years ago

What similarity is there between deep learning and the brain? My conclusion is that the similarity is not very high. Let's first look at why people so often like to draw comparisons between deep learning and the human brain.

When you build a neural network system, you use forward propagation and backward propagation. Because it is hard to give an intuitive explanation of why these complicated equations achieve the desired results, the analogy between deep learning and the brain, while an oversimplification, makes the process easier to explain. That simplicity makes it easy for the public to mention, use, or report on it in all kinds of media, and it has undoubtedly captured the public imagination.

There are indeed a few points of comparison, for example between a logistic unit with a sigmoid activation function and a picture of a neuron in the brain. In a diagram of a biological neuron, the neuron is a cell in your brain: it receives electrical signals from other neurons (say neurons x1, x2, x3, or other neurons a1, a2, a3), performs a simple threshold computation, and if the neuron fires, it sends a pulse of electricity down its axon, possibly to other neurons. So there is a simple analogy between a single logistic unit, one individual neuron in a neural network, and a single biological neuron.

But I think that even today, neuroscientists have almost no idea how even a single neuron works. A single neuron is far more complex than anything we can characterize with neuroscience. Part of what it does is a little like logistic regression, but much of what a single neuron does is something no human being understands today. For example, exactly how the neurons in the human brain learn is still a mystery, and to this day we do not know whether the brain uses anything like a backpropagation or gradient descent algorithm, or whether it relies on a fundamentally different learning principle.

So when I think about deep learning, I think of it as being very good at learning very flexible functions, very complex functions: mappings from x to y, from input to output in supervised learning. The analogy to the brain may once have been useful, but I think the field has advanced to the point where that analogy is breaking down, and I tend not to use it anymore. That is all I wanted to say about neural networks and the brain.

I do think that computer vision has drawn more inspiration from the brain than other disciplines influenced by deep learning, but I personally make the comparison to the human brain less than I used to.

So that is the content of this course. You now know how to use forward propagation, backpropagation, gradient descent, and deep neural network architectures. I hope you get through the programming exercises smoothly, and I look forward to sharing more with you in the next course.
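The "single neuron" the lecture compares to logistic regression can be made concrete. The sketch below (not from the course materials; the toy OR dataset, learning rate, and iteration count are my own choices for illustration) trains one logistic unit with the forward propagation, backward propagation, and gradient descent steps the lecture recaps, using the course's column-per-example convention where X has shape (n_x, m).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data (assumed for illustration): learn the OR of two binary inputs.
# X is (n_x, m) = (2, 4): one column per training example.
X = np.array([[0, 0, 1, 1],
              [0, 1, 0, 1]], dtype=float)
Y = np.array([[0, 1, 1, 1]], dtype=float)   # shape (1, 4)

w = np.zeros((2, 1))   # weights
b = 0.0                # bias
m = X.shape[1]
alpha = 1.0            # learning rate (assumed)

for _ in range(2000):
    # Forward propagation: A = sigmoid(w^T X + b), vectorized over all examples
    A = sigmoid(w.T @ X + b)
    # Backward propagation: gradients of the cross-entropy cost
    dZ = A - Y
    dw = (X @ dZ.T) / m
    db = dZ.sum() / m
    # Gradient descent update
    w -= alpha * dw
    b -= alpha * db

preds = (sigmoid(w.T @ X + b) > 0.5).astype(int)
print(preds)  # converges to the OR labels: [[0 1 1 1]]
```

The loose brain analogy lives in the forward pass alone: a weighted sum of inputs pushed through a nonlinearity. The training loop, in contrast, is exactly the kind of backpropagation/gradient-descent machinery that, as the lecture notes, has no known counterpart in how biological neurons learn.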