* adding weeks 03 to 06
* [PT] Adding weeks 03 to 06
* [PT] fix: Alf reviews for PR #805

Co-authored-by: Felipe Schiavon <felipeschiavondeoliveira@gmail.com>
Co-authored-by: Bernardo Lago <bernardolago@gmail.com>
Co-authored-by: Alfredo Canziani <alfredo.canziani@gmail.com>
Showing 22 changed files with 21,642 additions and 0 deletions.
@@ -0,0 +1,40 @@
---
lang: pt
lang-ref: ch.03
title: Semana 3
translator: Leon Solon
translation-date: 14 Nov 2021
---

<!--
## Lecture part A
-->

## Aula parte A

<!--We first see a visualization of a 6-layer neural network. Next we begin with the topic of Convolutions and Convolution Neural Networks (CNN). We review several types of parameter transformations in the context of CNNs and introduce the idea of a kernel, which is used to learn features in a hierarchical manner. Thereby allowing us to classify our input data which is the basic idea motivating the use of CNNs.
-->

Iniciamos com a visualização de uma rede neural de 6 camadas. A seguir, começamos com o tópico de Convoluções e Redes Neurais Convolucionais (CNN). Revisamos vários tipos de transformações de parâmetros no contexto de CNNs e apresentamos a ideia de um kernel, que é usado para aprender características de maneira hierárquica. Assim, podemos classificar nossos dados de entrada, que é a ideia básica que motiva o uso de CNNs.
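
Para tornar concreta a ideia de kernel, segue um esboço mínimo em PyTorch (o nome `extrator`, as dimensões 28×28 e as quantidades de canais 16 e 32 são escolhas ilustrativas, não fazem parte da aula) em que duas convoluções empilhadas extraem características de forma hierárquica:

```python
import torch
from torch import nn

# Dois kernels convolucionais 3x3 empilhados: a primeira camada extrai
# características locais simples e a segunda as combina em características
# de nível mais alto (extração hierárquica).
extrator = nn.Sequential(
    nn.Conv2d(in_channels=1, out_channels=16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(in_channels=16, out_channels=32, kernel_size=3, padding=1),
    nn.ReLU(),
)

x = torch.randn(1, 1, 28, 28)  # lote fictício: uma imagem 28x28 em tons de cinza
print(extrator(x).shape)       # torch.Size([1, 32, 28, 28])
```

Os mapas de características resultantes podem então alimentar um classificador, que é a ideia básica por trás do uso de CNNs para classificar os dados de entrada.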

<!--
## Lecture part B
-->

## Aula parte B

<!--We give an introduction on how CNNs have evolved over time. We discuss in detail different CNN architectures, including a modern implementation of LeNet5 to exemplify the task of digit recognition on the MNIST dataset. Based on its design principles, we expand on the advantages of CNNs which allows us to exploit the compositionality, stationarity, and locality features of natural images.
-->

Damos uma introdução sobre como as CNNs evoluíram ao longo do tempo. Discutimos em detalhes diferentes arquiteturas de CNN, incluindo uma implementação moderna de LeNet5 para exemplificar a tarefa de reconhecimento de dígitos no conjunto de dados MNIST. Com base em seus princípios de design, expandimos as vantagens das CNNs, o que nos permite explorar as características de composicionalidade, estacionariedade e localidade de imagens naturais.
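
A título de ilustração, um esboço aproximado de uma LeNet5 em PyTorch (esta não é a implementação exata usada na aula; os hiperparâmetros abaixo seguem a arquitetura clássica e servem apenas como referência):

```python
import torch
from torch import nn

class LeNet5(nn.Module):
    """Esboço de uma LeNet5 para dígitos do MNIST (entrada 1x32x32)."""

    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),   # -> 6x14x14
            nn.Conv2d(6, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),  # -> 16x5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120), nn.ReLU(),
            nn.Linear(120, 84), nn.ReLU(),
            nn.Linear(84, n_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Uso com um lote fictício de imagens 32x32 (MNIST 28x28 com padding de 2 pixels)
logits = LeNet5()(torch.randn(8, 1, 32, 32))
print(logits.shape)  # torch.Size([8, 10])
```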

<!--
## Practicum
-->

## Prática

<!--Properties of natural signals that are most relevant to CNNs are discussed in more detail, namely: Locality, Stationarity, and Compositionality. We explore precisely how a kernel exploits these features through sparsity, weight sharing and the stacking of layers, as well as motivate the concepts of padding and pooling. Finally, a performance comparison between FCN and CNN was done for different data modalities.
-->

As propriedades dos sinais naturais que são mais relevantes para as CNNs são discutidas em mais detalhes, a saber: Localidade, Estacionariedade e Composicionalidade. Exploramos precisamente como um kernel explora essas características por meio de esparsidade, compartilhamento de pesos e empilhamento de camadas, além de motivar os conceitos de preenchimento (padding) e pooling. Finalmente, uma comparação de desempenho entre FCN (rede totalmente conectada) e CNN foi feita para diferentes modalidades de dados.
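
Como ilustração (e não como o experimento feito na aula), o esboço abaixo compara o número de parâmetros de uma camada totalmente conectada com o de uma camada convolucional sobre a mesma entrada, mostrando de passagem o efeito de padding e pooling; as dimensões escolhidas são arbitrárias:

```python
import torch
from torch import nn

x = torch.randn(1, 1, 28, 28)  # mesma entrada fictícia para as duas abordagens

# Camada totalmente conectada: cada saída enxerga todos os pixels
# (sem esparsidade nem compartilhamento de pesos).
fc = nn.Linear(28 * 28, 32 * 28 * 28)

# Camada convolucional: conexões locais (kernel 3x3) com pesos compartilhados;
# padding=1 preserva o tamanho espacial e o pooling reduz a resolução pela metade.
conv = nn.Sequential(nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.MaxPool2d(2))

n_params = lambda m: sum(p.numel() for p in m.parameters())
print(n_params(fc))    # 19.694.080 parâmetros
print(n_params(conv))  # 320 parâmetros (32 kernels 3x3 + 32 vieses)
print(conv(x).shape)   # torch.Size([1, 32, 14, 14])
```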