An open-source NLP research library, built on PyTorch. AllenNLP makes it easy to design and evaluate new deep learning models for nearly any NLP problem, along with the infrastructure to run them in the cloud or on your laptop. AllenNLP includes reference implementations of high-quality models for both core NLP problems (e.g. semantic role labeling) and NLP applications (e.g. textual entailment).
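A minimal sketch of running a pretrained AllenNLP model on a single sentence. The archive path is a placeholder, and the `predict(sentence=...)` call assumes a sentence-level predictor such as the semantic role labeling one; the exact output fields depend on the model loaded.

```python
from allennlp.predictors.predictor import Predictor

# Load a pretrained model archive (placeholder path; substitute a real
# model archive published by AllenNLP, e.g. a semantic role labeling model).
predictor = Predictor.from_path("/path/to/srl-model.tar.gz")

# Run the predictor on one sentence; the structure of the result dictionary
# depends on which model/predictor was loaded.
result = predictor.predict(sentence="The keys were left on the table.")
print(result)
```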
Extract entities, relationships, and context from text using the Sapien Language Engine. This general natural language understanding API is used for text analytics or for building robust conversational interfaces.
Neural Models of Syntax.
A TensorFlow toolkit for deep learning powered natural language understanding (NLU).
Speech and natural language tools.
Open source machine learning tools for developers and product teams to expand bots beyond answering simple questions.
Rasa Core's primary purpose is to help you build contextual, layered conversations with lots of back-and-forth. To have a real conversation, you need to have some memory and build on things that were said earlier. Rasa Core lets you do that in a scalable way.
Rasa NLU is an open-source tool for intent classification and entity extraction. You can think of it as a set of high-level APIs for building your own language parser using existing NLP and ML libraries.
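A rough sketch of intent classification and entity extraction with the standalone `rasa_nlu` package; the model directory is a placeholder, and this API has since been folded into the unified `rasa` package, so treat it as illustrative rather than current.

```python
from rasa_nlu.model import Interpreter

# Load a previously trained NLU model directory (placeholder path).
interpreter = Interpreter.load("./models/current/nlu")

# Parse a user message; the result contains the predicted intent,
# a confidence score, and any extracted entities.
parsed = interpreter.parse("book a table for two at 7pm")
print(parsed["intent"])
print(parsed["entities"])
```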
With API.AI, you can create conversational scenarios within minutes, then build advanced dialogues to manage the conversation flow with the user. Once you build your agent, you can integrate it with SDKs or use our one-click integration modules. When your product is launched, it constantly improves with machine learning and can be updated in real-time based on user interactions.
Natural Language Interface for apps and devices. Build bots easily. You tell us what your user said, we tell you what your bot should do next. Your users give us voice or text, you get back structured data. It's that simple.
WordNet® is a large lexical database of English. Nouns, verbs, adjectives and adverbs are grouped into sets of cognitive synonyms (synsets), each expressing a distinct concept. Synsets are interlinked by means of conceptual-semantic and lexical relations. The resulting network of meaningfully related words and concepts can be navigated with the browser. WordNet is also freely and publicly available for download. WordNet's structure makes it a useful tool for computational linguistics and natural language processing.
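WordNet's synsets and relations can also be explored programmatically; a small sketch using NLTK's WordNet interface (the `wordnet` corpus must be downloaded once via `nltk.download`):

```python
from nltk.corpus import wordnet as wn  # requires nltk.download("wordnet")

# Synsets group words that express the same concept.
for synset in wn.synsets("car"):
    print(synset.name(), "-", synset.definition())

# Synsets are interlinked by conceptual-semantic relations such as hypernymy.
car = wn.synset("car.n.01")
print(car.hypernyms())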
ConceptNet is a semantic network containing lots of things computers should know about the world, especially when understanding text written by people.
It is built from nodes representing words or short phrases of natural language, and labeled relationships between them. (We call the nodes "concepts" for tradition, but they'd be better known as "terms".) These are the kinds of relationships computers need to know to search for information better, answer questions, and understand people's goals.
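A short sketch of looking up these labeled relationships through ConceptNet's public REST API at api.conceptnet.io; the JSON field names shown (`edges`, `rel`, `start`, `end`, `label`) reflect the API's edge format as I understand it, so verify against the live responses.

```python
import requests

# Query the public ConceptNet API for edges connected to the term "coffee".
response = requests.get("http://api.conceptnet.io/c/en/coffee").json()

# Each edge links two concepts (nodes) with a labeled relation such as /r/UsedFor.
for edge in response["edges"][:5]:
    print(edge["rel"]["label"], ":", edge["start"]["label"], "->", edge["end"]["label"])
```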
The Natural Language Toolkit (NLTK) is a Python package for natural language processing.
It provides easy-to-use interfaces to over 50 corpora and lexical resources such as WordNet, along with a suite of text processing libraries for classification, tokenization, stemming, tagging, parsing, and semantic reasoning, as well as wrappers for industrial-strength NLP libraries.
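A minimal example of the tokenization and tagging pipeline; the required tokenizer and tagger models are fetched once with `nltk.download`.

```python
import nltk

# One-time downloads of the tokenizer and POS-tagger models.
nltk.download("punkt")
nltk.download("averaged_perceptron_tagger")

text = "NLTK provides tokenization, tagging, and parsing tools."
tokens = nltk.word_tokenize(text)  # split the sentence into tokens
tags = nltk.pos_tag(tokens)        # assign part-of-speech tags
print(tags)
```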
TextBlob is a Python (2 and 3) library for processing textual data. It provides a simple API for diving into common natural language processing (NLP) tasks such as part-of-speech tagging, noun phrase extraction, sentiment analysis, classification, translation, and more.
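A brief sketch of that API (TextBlob's corpora must be installed once with `python -m textblob.download_corpora`):

```python
from textblob import TextBlob

blob = TextBlob("TextBlob makes common NLP tasks pleasantly simple.")

print(blob.tags)          # part-of-speech tags
print(blob.noun_phrases)  # extracted noun phrases
print(blob.sentiment)     # polarity and subjectivity scores
```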
Language detection as a service, with 99% precision across 50+ languages.
A simple and scalable way to automatically classify text by language.
LingPipe is a toolkit for processing text using computational linguistics.