WHERE FUTURE BEGINS
  • SELF DEEP LEARNING
  • LSE MBA Essentials - The London School of Economics
    • Leading with influence
    • Economics for managers
    • Competitive strategy
    • Corporate strategy
    • Financial accounting
    • Management accounting
    • Analysing financial statements
    • In the mind of the manager
    • Nudging behaviour
    • Organisational culture as a leadership tool
  • Business Foundations Specialization - Wharton Online
    • Introduction to Marketing
      • BRANDING: Marketing Strategy and Brand Positioning
      • Marketing 101: Building Strong Brands Part I
      • Marketing 101: Building Strong Brands Part II
      • Strategic Marketing
      • Segmentation and Targeting
      • Brand Positioning
      • Brand Mantra: The Elevator Speech
      • Experiential Branding
      • CUSTOMER CENTRICITY: The Limits of Product-Centric Thinking
      • Cracks in the Product-Centric Approach
      • Data-Driven Business Models
      • Three Cheers for Direct Marketing
      • Which Firms Are Customer Centric?
      • What is Customer Centricity?
      • Living in a Customer-Centric World
      • More Reflections on Customer Centricity
      • Questions on Customer Centricity
      • GO TO MARKET STRATEGIES: Online-Offline Interaction
      • Online/Offline Competition
      • Friction
      • The Long Tail Theory
      • Preference Isolation
      • How Internet Retailing Startups Grow
      • Customers and Digital Marketing
      • Influence and How Information Spreads
      • Pricing Strategies
      • The 7Ms
      • BRANDING: Effective Brand Communications Strategies and Repositioning Strategies
      • Brand Messaging & Communication
      • Brand Elements: Choosing a Brand Name
      • Brand Elements: Color & Taglines
      • Brand Elements: Packaging
      • Brand Elements: Persuasion
      • Repositioning a Brand
    • Introduction to Financial Accounting
      • 1.1.1: Financial Reporting Overview
      • 1.1.2: Financial Reporting Example
    • Managing Social and Human Capital
      • Professor Cappelli and Professor Useem Introductions
    • Introduction to Corporate Finance
      • Time Value of Money
      • Intuition and Discounting
      • Compounding
      • Useful Shortcuts
      • Taxes
      • Inflation
      • APR and EAR
      • Term Structure
      • Discounted Cash Flow: Decision Making
      • Discounted Cash Flow Analysis
      • Forecast Drivers
      • Forecasting Free Cash Flow
      • Decision Criteria
      • Sensitivity Analysis
      • Return on Investment
    • Introduction to Operations Management
    • Wharton Business Foundations Capstone
  • Artificial Intelligence Career Program - deeplearning.ai
    • Machine Learning
      • Introduction to Machine Learning
      • Supervised Learning
      • Unsupervised Learning
      • Model Representation - Linear Regression
      • Cost Function
      • Gradient Descent
      • Gradient Descent For Linear Regression
      • Linear Algebra
    • Deep Learning
    • Neural Networks and Deep Learning
      • Introduction to Deep Learning
      • What is a neural network?
      • Supervised Learning with Neural Networks
      • Why is Deep Learning taking off?
      • About this Course
      • Binary Classification
      • Logistic Regression
      • Gradient Descent
      • Derivatives
      • Computation graph
      • Derivatives with a Computation Graph
      • Logistic Regression Gradient Descent
      • Vectorization
      • Vectorizing Logistic Regression
      • Vectorizing Logistic Regression's Gradient Output
      • Broadcasting in Python
      • A note on python/numpy vectors
      • Explanation of logistic regression cost function (optional)
      • Neural Networks Overview
      • Neural Network Representation
      • Computing a Neural Network's Output
      • Vectorizing across multiple examples
      • Activation functions
      • Derivatives of activation functions
      • Gradient descent for Neural Networks
      • Backpropagation intuition (optional)
      • Random Initialization
      • Deep L-layer neural network
      • Forward Propagation in a Deep Network
      • Getting your matrix dimensions right
      • Why deep representations?
      • Building blocks of deep neural networks
      • Forward and Backward Propagation
      • Parameters vs Hyperparameters
      • What does this have to do with the brain?
    • Convolutional Neural Networks
      • Computer Vision
      • Edge Detection Example
      • Padding
      • Strided Convolutions
      • Convolutions Over Volume
      • One Layer of a Convolutional Network
      • Simple Convolutional Network Example
      • Pooling Layers
      • CNN Example - Fully Connected Layers
      • Why Convolutions?
    • Neural Network Theory [ETH]
    • Natural Language Processing
    • Computer Vision
  • IBM Data Science Professional Certificate
    • What is Data Science?
    • Open Source tools for Data Science
    • Data Science Methodology
    • Python for Data Science and AI
    • Databases and SQL for Data Science
    • Data Analysis with Python
    • Data Visualization with Python
    • Machine Learning with Python
    • Applied Data Science Capstone
  • Data Analytics
    • Python for Data Analysis
    • Data Structure and Algorithms
  • Programming Language
    • Python
    • R
    • SQL
    • C++
    • C
    • Java
    • HTML
  • Machine Learning Engineer
  • Business Data Analytics

Computation graph

You have already seen how the computations in a neural network are organized: a forward pass (forward propagation) computes the network's output, and a backward pass (backpropagation) computes the gradients, or derivatives. The computation graph explains why the computation is organized this way. In this video we will walk through an example.

To illustrate the computation graph, let's use an example simpler than logistic regression or a full neural network. Suppose we are trying to compute a function J of three variables a, b and c, where J = 3(a + bc). Computing this function takes three steps. First compute bc and call it u, so u = bc. Then compute v = a + u. Finally, the output is J = 3v. That is the function J we want to compute.

We can draw these three steps as a computation graph. Start with the three variables a, b and c. The first step computes u = bc; draw a box for it, with b and c as its inputs. The next step computes v = a + u; its inputs are u, which we just computed, and the variable a. Finally we get J = 3v.

As a concrete example, take a = 5, b = 3, c = 2. Then u = bc = 6, v = a + u = 11, and J = 3v = 33. You can check that expanding 3(5 + 3×2) indeed gives J = 33.

So the computation graph is convenient when there is some distinguished output variable, such as J in this example, that you want to optimize. In the case of logistic regression, J is of course the cost function we are trying to minimize. What this example shows is that a left-to-right pass computes the value of J. In the next few slides you will see that the opposite pass, from right to left, against the direction of the blue arrows, is the most natural way to compute the derivatives.

To recap: the computation graph is drawn with blue arrows for the left-to-right computation. In the next video we will see how the red arrows, going from right to left, are used to compute the derivatives. See you there.
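For a quick sanity check, here is a minimal Python sketch of the forward (left-to-right) pass over this computation graph. The function name forward is made up for illustration; the variable names u, v and J follow the lecture's notation, and the code is not taken from the course materials.

```python
# Forward pass over the computation graph for J = 3(a + bc),
# broken into the lecture's three steps: u = bc, v = a + u, J = 3v.
def forward(a, b, c):
    u = b * c      # step 1: u = bc
    v = a + u      # step 2: v = a + u
    J = 3 * v      # step 3: J = 3v
    return u, v, J

# The lecture's worked example: a = 5, b = 3, c = 2
u, v, J = forward(5, 3, 2)
print(u, v, J)  # prints: 6 11 33
```

Keeping the intermediate values u and v around is the point of the graph: they are exactly the quantities reused when the derivatives are computed right to left in the next video.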
