Recently I started tutoring a paper (COMP700 Text and Vision Intelligence) for AUT students on weekends.
This paper mostly involves Computer Vision and Text Mining (Natural Language Processing).
These two areas are common applications of Artificial Intelligence.
So, in this blog, I will provide a brief agenda for Natural Language Processing, for people who want a kick-start in the Artificial Intelligence area.
General intro to NLP:
~Tasks to describe languages
~Neural network model
NLP with deep learning:
~Recurrent Neural Network
~Convolutional Neural Network
NLP at different levels:
NLP in industry:
What needs to be done first, representing words:
~WordNet: a resource containing lists of synonym sets and hypernyms
~One-hot: words as discrete symbols
~Word embedding: a word’s meaning is given by the words that frequently appear close by
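To make the contrast between these representations concrete, here is a minimal pure-Python sketch (the toy corpus and window size are my own assumptions, not from the course): a one-hot vector treats every word as an unrelated discrete symbol, while a simple co-occurrence count vector already captures the idea that a word's meaning comes from the words that appear close by.

```python
# Toy corpus (a hypothetical example, not from the course material).
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
index = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    """Discrete symbol: a single 1, so no two words are ever similar."""
    v = [0] * len(vocab)
    v[index[word]] = 1
    return v

def cooccurrence_vector(word, window=2):
    """Distributional view: count the words that frequently appear close by."""
    v = [0] * len(vocab)
    for i, w in enumerate(corpus):
        if w == word:
            for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
                if j != i:
                    v[index[corpus[j]]] += 1
    return v

print(one_hot("cat"))
print(cooccurrence_vector("cat"))  # 'cat' and 'dog' get similar context counts
print(cooccurrence_vector("dog"))
```

Real embeddings (e.g. word2vec) learn dense versions of such vectors, but the intuition is the same.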
Knowledge graph, a semantics-driven approach:
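A knowledge graph can be sketched as a set of (subject, relation, object) triples; the specific facts below are just illustrative assumptions:

```python
# A minimal knowledge-graph sketch: facts as (subject, relation, object)
# triples, plus a toy query that follows one labelled edge.
triples = {
    ("Auckland", "city_in", "New Zealand"),
    ("AUT", "located_in", "Auckland"),
    ("COMP700", "taught_at", "AUT"),
}

def objects(subject, relation):
    """Return every object reachable from `subject` via `relation`."""
    return {o for s, r, o in triples if s == subject and r == relation}

print(objects("AUT", "located_in"))  # → {'Auckland'}
```

Production systems use dedicated triple stores and query languages such as SPARQL, but the data model is this simple at heart.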
Describing language tasks:
~POS: Part-Of-Speech tagging
~NER: Named Entity Recognition
~Algorithms: rule-based taggers; probabilistic taggers: HMM and Viterbi; Perceptron; conditional models: CRF
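As a sketch of the HMM-plus-Viterbi approach to tagging, here is a tiny pure-Python example. The two tags, the vocabulary, and all probabilities are made up for illustration; a real tagger estimates them from an annotated corpus.

```python
import math

# A toy HMM POS tagger: two states and hand-picked probabilities.
states = ["NOUN", "VERB"]
start_p = {"NOUN": 0.6, "VERB": 0.4}
trans_p = {"NOUN": {"NOUN": 0.3, "VERB": 0.7},
           "VERB": {"NOUN": 0.8, "VERB": 0.2}}
emit_p = {"NOUN": {"dogs": 0.5, "bark": 0.1, "cats": 0.4},
          "VERB": {"dogs": 0.1, "bark": 0.8, "cats": 0.1}}

def viterbi(words):
    """Return the most probable tag sequence (log-space avoids underflow)."""
    # V[t][s] = best log-probability of any path ending in state s at time t
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][words[0]])
          for s in states}]
    back = [{}]
    for t in range(1, len(words)):
        V.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
            V[t][s] = (V[t - 1][prev] + math.log(trans_p[prev][s])
                       + math.log(emit_p[s][words[t]]))
            back[t][s] = prev
    # Trace the best path backwards from the best final state.
    last = max(states, key=lambda s: V[-1][s])
    tags = [last]
    for t in range(len(words) - 1, 0, -1):
        tags.append(back[t][tags[-1]])
    return list(reversed(tags))

print(viterbi(["dogs", "bark"]))  # → ['NOUN', 'VERB']
```

The dynamic-programming table keeps only the best path into each state at each step, which is what makes Viterbi linear in the sentence length rather than exponential.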
Statistical Language Model:
~CBoW: Continuous Bag-of-Words Model
~Back-propagation: what a cost function is, and how the cost function is used to update parameters
~ReLU (rectified linear unit)
~Learn (multiple levels of) representation and an output from ‘raw’ inputs x
~A universal, learnable framework for representing world, visual, and linguistic information
~Can learn unsupervised (from raw text) and supervised
~Why popular now: large datasets, faster machines, and multicore CPUs/GPUs
~Why it works
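The back-propagation and ReLU bullets above can be sketched together in a few lines of pure Python. This is my own toy example, not course material: a one-hidden-unit network `y_hat = w2 * relu(w1 * x)` fitted to made-up data, where the squared-error cost function drives gradient-descent updates of both weights via the chain rule.

```python
# A minimal sketch of back-propagation: the cost function's gradients
# update the parameters. Network: y_hat = w2 * relu(w1 * x).

def relu(z):
    """Rectified linear unit: max(0, z)."""
    return max(0.0, z)

# Toy data (hypothetical): learn y = 3x on positive inputs.
data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]

w1, w2, lr = 0.5, 0.5, 0.01

def cost():
    """Mean squared error over the toy data set."""
    return sum((w2 * relu(w1 * x) - y) ** 2 for x, y in data) / len(data)

before = cost()
for _ in range(200):
    for x, y in data:
        h = relu(w1 * x)
        y_hat = w2 * h
        grad_out = 2 * (y_hat - y)        # d(cost)/d(y_hat)
        grad_w2 = grad_out * h            # chain rule through the output weight
        grad_w1 = grad_out * w2 * (1.0 if w1 * x > 0 else 0.0) * x
        w2 -= lr * grad_w2                # gradient-descent updates
        w1 -= lr * grad_w1
after = cost()
print(before, "->", after)  # the cost drops as the parameters are updated
```

The ReLU derivative is simply 0 or 1, which is one reason it became the default activation for deep networks: gradients pass through active units unchanged instead of being squashed.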
Deep Learning Models:
~Recurrent Neural Networks: simple RNN, LSTM, GRU
~Generative Neural Networks
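To show what "recurrent" means concretely, here is a minimal sketch of a simple (Elman) RNN cell in pure Python. The dimensions and weight values are arbitrary illustrative choices; real models learn the weight matrices with a framework such as PyTorch, and LSTM and GRU cells add gating on top of this same recurrence.

```python
import math

def rnn_step(x, h_prev, W_xh, W_hh, b_h):
    """One recurrence step: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b_h)."""
    h = []
    for i in range(len(h_prev)):
        z = b_h[i]
        z += sum(W_xh[i][j] * x[j] for j in range(len(x)))
        z += sum(W_hh[i][j] * h_prev[j] for j in range(len(h_prev)))
        h.append(math.tanh(z))
    return h

# Arbitrary weights: 2-dimensional inputs, 2-unit hidden state.
W_xh = [[0.5, -0.3], [0.1, 0.8]]
W_hh = [[0.2, 0.0], [0.0, 0.2]]
b_h = [0.0, 0.0]

h = [0.0, 0.0]
for x in [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]:
    h = rnn_step(x, h, W_xh, W_hh, b_h)  # the hidden state carries context
print(h)
```

Because the same weights are reused at every time step, the hidden state `h` acts as a running summary of everything seen so far, which is what lets RNNs model word sequences of any length.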
If you are interested in, or have any problems with, Natural Language Processing, feel free to contact me.
Or you can connect with me through my LinkedIn.