The project is in an early stage of development (Public Alpha). You can already use the bot builder to build and test your chatbots, and you can integrate them into your website. Feedback is welcome.
Thanks to this kind of technology, the graph doesn't have to wait for user input to generate new conclusions. Imagine the following situation: the bot hands over a question with a few suggestions to the chat and then waits for user input. But while it waits, the context and the situation can change, so the current question or topic can become useless or subordinate. For example, while the bot waits, a timer can expire, or an API or the website can generate some other contextual information. The bot will then take all the new information into account and may suddenly prefer another question. It's like in real life: while you're talking to your friend about getting into the pool today, it could suddenly start raining. The whole topic becomes useless, the topic changes, and you may continue talking about the newest cinema films.
In short, there are no active blocking nodes as in many other chatbots. Each time an underlying piece of information changes, the graph re-evaluates the entire context. This means that past conclusions, decisions and actions are questioned again. Through this technology, different tasks can be parallelized and the conversation becomes more dynamic. It also makes no difference how and when information gets into the graph. Information can be actively inquired, returned by APIs, provided before the conversation starts, or injected by external processes at any time. This means that a logic graph responds to the whole environment rather than just the user input. New information affects the whole graph, not just the next question or action.
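The idea above can be sketched in a few lines. This is a minimal illustration, not the actual Wanderer.ai API: every node scores itself against the full context, and the graph re-selects the best next action whenever any piece of context changes, instead of blocking on the question it asked last.

```javascript
// Hypothetical sketch of non-blocking graph evaluation.
// Each node rates its own relevance against the current context:
const nodes = [
  { id: "ask-pool",   score: (ctx) => (ctx.raining ? 0 : 2) },
  { id: "ask-cinema", score: (ctx) => (ctx.raining ? 2 : 1) },
];

function nextAction(ctx) {
  // Re-evaluate the whole graph against the current context,
  // rather than waiting on the previously asked question.
  return nodes.reduce((best, n) => (n.score(ctx) > best.score(ctx) ? n : best));
}

let context = { raining: false };
nextAction(context).id; // "ask-pool"

// A timer, API response, or website event updates the context,
// and the graph suddenly prefers another topic:
context = { raining: true };
nextAction(context).id; // "ask-cinema"
```

The point is that the context update alone changes the bot's behavior; no node is actively blocking the conversation while it waits.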
Since this bot runs completely in the user's browser, it will not communicate any user input or information to any servers by default. Yes, of course: you can use API nodes and webhooks. But you have to build such a structure first. So it's your decision whether you want to send user input anywhere, and you can decide on your own which information will be sent to which servers.
Yes! Absolutely. You can use the bot builder on the Wanderer.ai website for free. You don't have to register or pay to use this application. This only becomes relevant if you plan to run the builder or chatbots under your own public domains or inside your apps and you use professional plugins. In this case you have to purchase a license.
This project uses a dual licensing model. Most of the packages are available under the AGPLv3, but some plugins require a professional license. That means you have to purchase a license if you plan to run the bot builder or individual chatbots on your own website using these professional plugins. You can use the professional plugins for free as long as you use them on the Wanderer.ai domain.
The pricing model is not fully worked out right now, but I want to keep it simple: you have to purchase a professional license for each domain on which a chatbot with professional plugins runs.
At the moment Wanderer.ai does not store any data for you, so you have to download your flows. You can then store them wherever you want: keep your conversation flow on your device, or upload it to GitHub to share it with others.
No. The core of the project uses a logic graph to find the next meaningful action based on a complex context. This technology combines a kind of flow programming with traversal algorithms and graph theory. Nevertheless, deep learning and neural networks can be coupled through APIs to generate new information for the graph.
You should first ask yourself what the benefit of your chatbot should be. Should the chatbot primarily help the user to solve their own problem? Or should the bot influence the user with invisible and secret information to reach a hidden goal defined by the company? In general, I believe that transparent chatbots will create more trust between customers and a company.
This project uses logic-less templates powered by Mustache.js. That means you can output data in your messages, but logic is not supported inside the messages themselves: you cannot declare variables or call functions from inside templates. The reason is that the logic should be depicted through the graph structure and not hidden inside cumbersome templates. This is part of the strict design pattern of Wanderer.ai. It would also be a major security hole.
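To illustrate what "logic-less" means, here is a tiny stand-in for Mustache-style variable interpolation (the real builder uses Mustache.js; this is only a sketch of the idea): a template can reference data with `{{name}}` placeholders, but it cannot branch, loop arbitrarily, or call functions.

```javascript
// Illustrative stand-in for Mustache-style interpolation (not Mustache.js itself).
// Templates can only output data; there is no way to embed logic here.
function render(template, data) {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (_, key) =>
    key in data ? String(data[key]) : ""
  );
}

const message = render("Hello {{name}}, your order {{orderId}} has shipped.", {
  name: "Ada",
  orderId: 4711,
});
// message === "Hello Ada, your order 4711 has shipped."
```

The decision about *which* message to send lives in the graph; the template only fills in the data it is given.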
No. At the moment this is not supported. It may be supported in the future.
Yes. The bot can communicate with all sorts of APIs that respond with plain JSON.
That's a really good question. In general, API communication should be reduced as much as possible. This creates systems that are less dependent, more privacy-friendly and easier to maintain. However, Wanderer.ai will not stop you from targeting APIs. If you need API keys, you should route the requests through your own server, which attaches the keys to the request.
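A minimal sketch of that pattern, with hypothetical names and a placeholder upstream URL: the chatbot in the browser calls your own endpoint, and a small server-side handler rebuilds the request for the real API with the secret key attached, so the key never reaches the browser.

```javascript
// Hypothetical server-side helper: rebuild a client request for the
// upstream API and attach the secret key there, never in the browser.
function buildUpstreamRequest(clientRequest, apiKey) {
  return {
    url: "https://api.example.com" + clientRequest.path, // placeholder upstream API
    method: clientRequest.method,
    headers: {
      ...clientRequest.headers,
      // The secret is attached on the server side only:
      Authorization: "Bearer " + apiKey,
    },
    body: clientRequest.body,
  };
}

const upstream = buildUpstreamRequest(
  {
    path: "/v1/weather?city=Berlin",
    method: "GET",
    headers: { Accept: "application/json" },
  },
  process.env.API_KEY || "demo-key" // read the key from the server environment
);
// upstream.headers.Authorization now carries the key; the browser only
// ever talked to your own server.
```

The browser-side API node then targets your server's URL instead of the third-party API directly.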
No. Since Wanderer.ai currently only runs in the browser front end, connecting to other chat channels is technically not possible at the moment. If the project and the technology are successful, I will think about a server variant. But for the moment this is a browser-only solution.
This is not possible at the moment because the software is private by design. You can, however, ask the users for data donations and then collect the user information through an API, for example. You can also use analytics services like Google Analytics or etracker to get insights into the user's flow.
NLU is a really big topic. In general, you can use any service that is reachable through an API. But there are plans to implement an open-source NLU system, so that you will be able to talk more freely to the bot in the future.