Joshua Kim
PhD Candidate in NLP and Deep Learning; Data Science Consultant
  • Twitter
  • LinkedIn
  • GitHub

Natural Language Processing

There are 5 posts filed in Natural Language Processing (this is page 1 of 1).

Conversation Datasets and Learning Character Styles from Movie Dialogues

As Artificial Intelligence continues to push the boundaries of cognition, it takes on a challenge that we humans meet so naturally – understanding and responding in natural language.

Continue Reading →
in Natural Language Processing | 897 Words | Comment

Exploring emotion combinations using word2vec

In this blog post, we explore two sets of emotion combinations using word2vec: one posited by Robert Plutchik in 1980, and the other a popular media chart featured on vox.com using characters from Inside Out.

Continue Reading →
in Natural Language Processing | 1,016 Words | 1 Webmention | Comment
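The idea behind combining emotions with word vectors can be sketched in a few lines: average the vectors of two component emotions and look for the nearest remaining emotion by cosine similarity. The 3-dimensional vectors below are hypothetical stand-ins for real word2vec embeddings (which are typically hundreds of dimensions), chosen only to illustrate the arithmetic, not taken from the post.

```python
import math

# Toy 3-dimensional "embeddings" for a few emotions (hypothetical values,
# stand-ins for real word2vec vectors).
emotions = {
    "joy":          [0.9, 0.1, 0.2],
    "trust":        [0.7, 0.6, 0.1],
    "fear":         [-0.8, 0.3, 0.5],
    "anticipation": [0.4, 0.8, 0.3],
    "love":         [0.8, 0.4, 0.1],
}

def cosine(u, v):
    # Cosine similarity between two vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def combine(a, b):
    # Plutchik-style dyad: average the two component vectors.
    return [(x + y) / 2 for x, y in zip(emotions[a], emotions[b])]

def nearest(vec, exclude):
    # Find the emotion whose vector is most similar to the combination,
    # ignoring the two components themselves.
    candidates = {k: v for k, v in emotions.items() if k not in exclude}
    return max(candidates, key=lambda k: cosine(vec, candidates[k]))

# Plutchik posits joy + trust = love; the toy vectors agree.
dyad = combine("joy", "trust")
print(nearest(dyad, exclude={"joy", "trust"}))  # → love
```

With real word2vec vectors one would use a trained model's lookup in place of the toy dictionary; the averaging-and-nearest-neighbour step stays the same.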

Distributed representation of anything

In this review, we explore various distributed representations of anything we find on the Internet – words, paragraphs, people, photographs. These representations can be used for a variety of purposes, as illustrated below. We deliberately select subjects that seem disparate, rather than providing a comprehensive review of all applications of distributed representations.

Continue Reading →
in Natural Language Processing | 940 Words | Comment

Understanding how Convolutional Neural Networks (CNNs) perform text classification with word embeddings

When learning to apply a CNN to word embeddings, keeping track of the dimensions of the matrices can be confusing. The aim of this short post is simply to keep track of these dimensions and to understand how a CNN works for text classification.

Continue Reading →
in Natural Language Processing | 821 Words | 3 Webmentions | 53 Comments
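The dimension bookkeeping the post describes can be sketched with plain shape arithmetic. The example sizes below (a 7-word sentence, 5-dimensional embeddings, filter heights 2, 3 and 4 with two filters each) are assumptions for illustration, not figures from the post.

```python
# Assumed example sizes: a 7-word sentence embedded as a 7 x 5 matrix.
n, d = 7, 5
filter_heights = [2, 3, 4]
filters_per_height = 2

feature_map_lengths = {}
for h in filter_heights:
    # A filter of shape h x d slides vertically over the n x d sentence
    # matrix with stride 1, producing a feature map of length n - h + 1.
    feature_map_lengths[h] = n - h + 1

# 1-max pooling reduces each feature map to a single number, so the final
# feature vector fed to the classifier has one entry per filter.
feature_vector_size = filters_per_height * len(filter_heights)

print(feature_map_lengths)   # → {2: 6, 3: 5, 4: 4}
print(feature_vector_size)   # → 6
```

Because each filter spans the full embedding width d, the convolution is effectively one-dimensional over word positions, which is why only the filter height changes the feature-map length.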

Introduction to Word Embeddings

Word embeddings are commonly used in many Natural Language Processing (NLP) tasks because they have been found to be useful representations of words and often lead to better performance across tasks. Given their widespread use, this post seeks to introduce the concept of word embeddings to the prospective NLP practitioner.

Continue Reading →
in Natural Language Processing | 983 Words | 1 Webmention | Comment
Independent Publisher empowered by WordPress