[PT] Adding weeks 3 to 6 #805

Merged
merged 7 commits on Apr 25, 2022
18 changes: 18 additions & 0 deletions docs/_config.yml
@@ -757,6 +757,24 @@ pt:
- path: pt/week02/02-1.md
- path: pt/week02/02-2.md
- path: pt/week02/02-3.md
- path: pt/week03/03.md
sections:
- path: pt/week03/03-1.md
- path: pt/week03/03-2.md
- path: pt/week03/03-3.md
- path: pt/week04/04.md
sections:
- path: pt/week04/04-1.md
- path: pt/week05/05.md
sections:
- path: pt/week05/05-1.md
- path: pt/week05/05-2.md
- path: pt/week05/05-3.md
- path: pt/week06/06.md
sections:
- path: pt/week06/06-1.md
- path: pt/week06/06-2.md
- path: pt/week06/06-3.md
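The added `_config.yml` entries follow the site's nesting convention: each week's chapter file carries a `sections` list of its lecture files, all under the same `pt/weekNN/` folder. A minimal sketch (not part of the PR; the dict below just mirrors the YAML added above) of how one might sanity-check that structure before merging:

```python
# Mirror of the chapter/section structure added to docs/_config.yml.
# Hypothetical check script, not something the repo ships.
pt_chapters = [
    {"path": "pt/week03/03.md",
     "sections": ["pt/week03/03-1.md", "pt/week03/03-2.md", "pt/week03/03-3.md"]},
    {"path": "pt/week04/04.md",
     "sections": ["pt/week04/04-1.md"]},
    {"path": "pt/week05/05.md",
     "sections": ["pt/week05/05-1.md", "pt/week05/05-2.md", "pt/week05/05-3.md"]},
    {"path": "pt/week06/06.md",
     "sections": ["pt/week06/06-1.md", "pt/week06/06-2.md", "pt/week06/06-3.md"]},
]

# Every section file should live in the same week folder as its chapter page.
for chapter in pt_chapters:
    folder = chapter["path"].rsplit("/", 1)[0]
    assert all(s.startswith(folder + "/") for s in chapter["sections"])
print("config structure consistent")
```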

################################## Hungarian ###################################
hu:
487 changes: 487 additions & 0 deletions docs/pt/week03/03-1.md

Large diffs are not rendered by default.

476 changes: 476 additions & 0 deletions docs/pt/week03/03-2.md


285 changes: 285 additions & 0 deletions docs/pt/week03/03-3.md


40 changes: 40 additions & 0 deletions docs/pt/week03/03.md
@@ -0,0 +1,40 @@
---
lang: pt
lang-ref: ch.03
title: Semana 3
translator: Leon Solon
translation-date: 14 Nov 2021
---

<!--
## Lecture part A
-->

## Aula parte A

<!--We first see a visualization of a 6-layer neural network. Next we begin with the topic of Convolutions and Convolution Neural Networks (CNN). We review several types of parameter transformations in the context of CNNs and introduce the idea of a kernel, which is used to learn features in a hierarchical manner. Thereby allowing us to classify our input data which is the basic idea motivating the use of CNNs.
-->

Iniciamos com a visualização de uma rede neural de 6 camadas. A seguir, começamos com o tópico de Convoluções e Redes Neurais Convolucionais (CNN). Revisamos vários tipos de transformações de parâmetros no contexto de CNNs e apresentamos a ideia de um kernel, que é usado para aprender características de maneira hierárquica. Assim, podemos classificar nossos dados de entrada, que é a ideia básica que motiva o uso de CNNs.
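The kernel idea summarized above can be sketched in a few lines: a small weight matrix slides over the input and produces one feature-map value per position. A minimal pure-Python illustration (strictly a cross-correlation, as in most deep-learning libraries; the image and kernel values are made up for the example):

```python
# Minimal "valid" 2D convolution sketch: the kernel slides over the input
# and each output value is the weighted sum of the patch under it.
def conv2d(image, kernel):
    H, W = len(image), len(image[0])
    kH, kW = len(kernel), len(kernel[0])
    out = []
    for i in range(H - kH + 1):
        row = []
        for j in range(W - kW + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kH) for dj in range(kW))
            row.append(s)
        out.append(row)
    return out

# A vertical-edge kernel responds where intensity changes left to right.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
print(conv2d(image, kernel))  # [[0, 2, 0], [0, 2, 0]]
```

The output peaks exactly at the 0-to-1 boundary, which is the "feature detected" signal that stacked layers then combine hierarchically.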

<!--
## Lecture part B
-->

## Aula parte B

<!--We give an introduction on how CNNs have evolved over time. We discuss in detail different CNN architectures, including a modern implementation of LeNet5 to exemplify the task of digit recognition on the MNIST dataset. Based on its design principles, we expand on the advantages of CNNs which allows us to exploit the compositionality, stationarity, and locality features of natural images.
-->

Damos uma introdução de como as CNNs evoluíram ao longo do tempo. Discutimos em detalhes diferentes arquiteturas de CNN, incluindo uma implementação moderna de LeNet5 para exemplificar a tarefa de reconhecimento de dígitos no conjunto de dados MNIST. Com base em seus princípios de design, detalhamos as vantagens das CNNs, que nos permitem explorar as características de composicionalidade, estacionariedade e localidade das imagens naturais.
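To make the LeNet5 reference concrete, here is a hedged back-of-the-envelope parameter count for the classic LeNet-5 layout on 32×32 digit inputs (conv → pool → conv → pool → three fully connected layers). Modern reimplementations vary in activation and pooling details, but the layer shapes below follow the commonly cited original design:

```python
# Parameter count of a classic LeNet-5-style network (sketch; modern
# variants differ in details such as activations and pooling type).
def conv_params(in_ch, out_ch, k):
    # each output channel has an in_ch * k * k kernel plus one bias
    return out_ch * (in_ch * k * k + 1)

def fc_params(n_in, n_out):
    # weight matrix plus one bias per output unit
    return n_out * (n_in + 1)

layers = [
    ("conv1 1->6, 5x5",  conv_params(1, 6, 5)),      # followed by 2x2 pooling
    ("conv2 6->16, 5x5", conv_params(6, 16, 5)),     # followed by 2x2 pooling
    ("fc1 400->120",     fc_params(16 * 5 * 5, 120)),
    ("fc2 120->84",      fc_params(120, 84)),
    ("fc3 84->10",       fc_params(84, 10)),
]
total = sum(p for _, p in layers)
for name, p in layers:
    print(f"{name}: {p}")
print("total:", total)  # 61706 parameters in this layout
```

Note how the two convolutional layers contribute only ~2.5k of the ~62k parameters: almost all the capacity sits in the fully connected head.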

<!--
## Practicum
-->

## Prática

<!--Properties of natural signals that are most relevant to CNNs are discussed in more detail, namely: Locality, Stationarity, and Compositionality. We explore precisely how a kernel exploits these features through sparsity, weight sharing and the stacking of layers, as well as motivate the concepts of padding and pooling. Finally, a performance comparison between FCN and CNN was done for different data modalities.
-->

As propriedades dos sinais naturais mais relevantes para as CNNs são discutidas em mais detalhes, a saber: Localidade, Estacionariedade e Composicionalidade. Exploramos precisamente como um kernel explora essas características por meio de esparsidade (sparsity), compartilhamento de pesos e empilhamento de camadas, além de motivar os conceitos de preenchimento (padding) e pooling. Finalmente, uma comparação de desempenho entre FCN (redes totalmente conectadas) e CNN foi feita para diferentes modalidades de dados.
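The sparsity and weight-sharing argument in the paragraph above has a one-line arithmetic version: mapping a 28×28 image to a 24×24 feature map costs vastly more in a fully connected layer than with a single shared 5×5 kernel. A small illustrative sketch (layer sizes chosen for the example, not from the lecture):

```python
# Fully connected vs. convolutional parameter count for the same mapping:
# a 28x28 input producing a 24x24 output (28 - 5 + 1 = 24 for a 5x5 kernel).
n_in, n_out, k = 28 * 28, 24 * 24, 5

fc = n_out * (n_in + 1)  # every output unit sees every input pixel (+ bias)
conv = k * k + 1         # one 5x5 kernel shared across all positions (+ bias)

print("fully connected:", fc)   # 452160
print("convolutional:  ", conv)  # 26
```

This roughly 17,000× gap is exactly what weight sharing buys, and it is why the FCN-versus-CNN comparison comes out the way it does on natural images.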