Google Colab

Deep learning models often require a large amount of computational power to train. This requirement is usually met by parallelizing the execution of these models, which can reduce training time by several orders of magnitude. Because of this benefit, researchers have shifted from CPU-based processing to a GPU-based approach (using cards like the NVIDIA Tesla) when training their models. However, these GPUs and GPU clusters are often quite expensive and thus inaccessible to most.

Read More

Hand Coding Our Very Own Type Token Ratio Generator

ABOUT

Natural Language Processing is a branch of Artificial Intelligence in which we take text or speech in natural languages (e.g. English or Hindi) and convert it into a format a computer can manipulate. We then use the converted data in fields such as Analytics, Machine Translation, Part-of-Speech Tagging, Chatbots, Summarization, etc. The possibilities for extracting data from natural language and quantifying it are endless.
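To make the idea of quantifying text concrete, here is a minimal sketch of a type-token ratio calculation, the metric this post builds by hand. It assumes simple lowercasing and whitespace tokenization; the full post may tokenize differently.

def type_token_ratio(text: str) -> float:
    """Ratio of unique tokens (types) to total tokens in a text."""
    # Assumption: lowercase and split on whitespace; a real tokenizer
    # would also handle punctuation.
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    return len(set(tokens)) / len(tokens)

print(type_token_ratio("the cat sat on the mat"))  # 5 unique / 6 total, roughly 0.83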

Let us take the example of using your Google Assistant.

Read More