Week 7 Homework
- Arrow functions, map(), reduce() (see the quick refresher after this list)
- Bayesian Classification:
  - A2Z notes
  - A Plan for Spam by Paul Graham
  - JS Bayes Classifier (in progress)
- Word2Vec:
  - Allison Parrish Tutorial
  - JS word2vec code (in progress)
- Tracery and CFG
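A quick refresher on the first item, since the exercises below lean on this syntax (the sample data here is arbitrary):

```js
// Arrow functions with map() and reduce():
// map() transforms each element; reduce() folds the array into one value.
const words = ['the', 'quick', 'brown', 'fox'];
const lengths = words.map((w) => w.length);           // [3, 5, 5, 3]
const total = lengths.reduce((sum, n) => sum + n, 0); // 16
console.log(lengths, total);
```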
Create an exercise around one of the following three topics (or combine more than one!). Feel free to build on top of something you've done previously this semester or start something new. There are a lot of ideas below; just pick one small thing to try!
Bayesian Classification:
- Use the Bayes Classifier library to build a simple example that classifies text according to some training data. What happens with training data from user input vs. larger datasets from, for example, Project Gutenberg? (See the sketch after this list.)
- Contribute to the Bayes Classifier library itself: list of issues
- Try one of these other two Bayes classifier libraries: ttezel's bayes or node natural
- Create a visualization to explain Bayesian reasoning. For inspiration you might consider this Explaining Bayesian Problems video or The Monty Hall Problem.
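Since the JS Bayes Classifier library is still in progress, here is a minimal from-scratch sketch of the technique itself: a two-category naive Bayes classifier with add-one smoothing. The class name, tokenizer, and training sentences are illustrative choices, not the library's API.

```js
// A tiny naive Bayes text classifier, written from scratch.
class TinyBayes {
  constructor() {
    this.counts = {}; // category -> { word -> count }
    this.totals = {}; // category -> total word count
    this.docs = {};   // category -> number of training documents
    this.vocab = new Set();
  }

  tokenize(text) {
    return text.toLowerCase().match(/[a-z']+/g) || [];
  }

  train(category, text) {
    this.counts[category] = this.counts[category] || {};
    this.totals[category] = this.totals[category] || 0;
    this.docs[category] = (this.docs[category] || 0) + 1;
    for (const word of this.tokenize(text)) {
      this.counts[category][word] = (this.counts[category][word] || 0) + 1;
      this.totals[category]++;
      this.vocab.add(word);
    }
  }

  classify(text) {
    const totalDocs = Object.values(this.docs).reduce((a, b) => a + b, 0);
    let best = null;
    let bestScore = -Infinity;
    for (const category of Object.keys(this.counts)) {
      // log prior + sum of log likelihoods, with add-one smoothing
      let score = Math.log(this.docs[category] / totalDocs);
      for (const word of this.tokenize(text)) {
        const count = this.counts[category][word] || 0;
        score += Math.log((count + 1) / (this.totals[category] + this.vocab.size));
      }
      if (score > bestScore) {
        bestScore = score;
        best = category;
      }
    }
    return best;
  }
}

const bayes = new TinyBayes();
bayes.train('happy', 'what a wonderful lovely day');
bayes.train('sad', 'a gloomy dreary miserable day');
console.log(bayes.classify('such a lovely morning')); // 'happy'
```

For the user-input vs. Project Gutenberg question, the same code applies: the more (and more representative) text you feed train(), the more stable the word counts become.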
Word2Vec:
- Using the color vectors example, compute the average color of a text, or try one of the other ideas described in Allison Parrish's tutorial.
- Contribute to the word2vec library itself: list of issues
- Compute the average vector for a chunk of text (sentence, paragraph, longer text). Then display the words closest to that average vector. (See the sketch after this list.)
- Experiment with poetry generation using word vectors and average sentence vectors as in Allison Parrish's tutorial.
- Try generating word vectors by training on your own data. Investigate gensim.
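To make the averaging idea concrete, here is a sketch that assumes you have word vectors loaded as a plain object mapping each word to an array of numbers; the vectors.json file and its shape are hypothetical, not the library's actual format:

```js
// Average the vectors of all known words in a text, then list the words
// whose vectors are closest (by cosine similarity) to that average.
const vectors = require('./vectors.json'); // hypothetical word -> [numbers] map

function averageVector(text) {
  const words = (text.toLowerCase().match(/[a-z']+/g) || [])
    .filter((w) => vectors[w]); // skip words we have no vector for
  if (words.length === 0) return null;
  const avg = new Array(vectors[words[0]].length).fill(0);
  for (const w of words) {
    vectors[w].forEach((v, i) => { avg[i] += v / words.length; });
  }
  return avg;
}

function cosine(a, b) {
  let dot = 0, ma = 0, mb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    ma += a[i] * a[i];
    mb += b[i] * b[i];
  }
  return dot / (Math.sqrt(ma) * Math.sqrt(mb));
}

function nearest(vec, n = 10) {
  return Object.keys(vectors)
    .map((w) => ({ word: w, sim: cosine(vec, vectors[w]) }))
    .sort((a, b) => b.sim - a.sim)
    .slice(0, n);
}

const avg = averageVector('so much depends upon a red wheel barrow');
if (avg) console.log(nearest(avg, 5));
```

The same averageVector() function works for the color vectors idea: swap in 3-dimensional RGB vectors and the average vector is the average color.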
Tracery and CFG:
- Invent your own grammar and use one of the existing examples to generate text with it. There are lots of example grammars you can find online, though you will likely have to reformat them to match the grammar syntax from my example or JSON.
- Rework any of the example programs to use something other than text (or, at least, text that represents language) as its basic unit. For example: musical notes, songs in playlists, pixels in an image, shapes in a drawing, etc.
- Build a grammar that pulls its terminal words from Wordnik.
- Build a grammar based on a source text as demonstrated here (source code).
- Getting results from a context-free grammar can be tricky. Consider a Twitter bot and the conciseness of 140 characters: short, sweet, highly structured ideas may work well (see the sketch after this list):
  - A coffee drink order generator.
  - An apology generator.
  - An ITP project idea generator.
  - A knock knock joke generator.
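To make the grammar ideas concrete, here is a minimal context-free expansion sketch (a coffee drink order generator, per the first idea above). The rule format mimics Tracery's JSON style, but the expander is a simplified stand-in written from scratch, not the Tracery library:

```js
// A tiny context-free grammar expander in the spirit of Tracery.
// Each rule maps a symbol to an array of possible expansions; #symbol#
// marks a non-terminal to be expanded recursively.
const grammar = {
  origin: ['I would like a #size# #milk# #drink#, please.'],
  size: ['small', 'medium', 'large'],
  milk: ['oat milk', 'soy milk', 'whole milk', 'almond milk'],
  drink: ['latte', 'cappuccino', 'cold brew', 'flat white'],
};

function choice(arr) {
  return arr[Math.floor(Math.random() * arr.length)];
}

function expand(symbol) {
  // Pick a random expansion, then recursively expand any #symbol# tags.
  return choice(grammar[symbol]).replace(/#(\w+)#/g, (_, s) => expand(s));
}

console.log(expand('origin'));
// e.g. "I would like a large oat milk cold brew, please."
```

The apology, project idea, and knock knock generators are the same program with a different grammar object; only the rules change.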
- add your question here (name)
- name, links, etc.
- Koji -- learn about nltk and gensim
- Nouf --
- Yeseul --
- Richard -- Word2Vec - 2Color....2WordAgain
- Michael -- Forest : Tree - (context free music)
- Swapna --
- Melissa --
- Tong --
- Jenn -- Good News Generator, using Tracery example
- Laura --
- Kenzo --
- Nitish --
- Paula -- Even dreams think
- Annie --
- Utsav --
- Zach --