I have seen this question on forums, and it also comes up in the following versions:
ManyChat vs Dialogflow
FlowXO vs Dialogflow
GoBot vs Dialogflow
Turns out, the answers to these questions have an important thing in common:
Conditional Logic Bot Services vs NLP based Bot Building Frameworks
In other words, they are not in competition. If anything, they should be cooperating!
Conditional Logic Bots
A conditional logic bot is one where you essentially map out a scripted conversation. Chatfuel is a good example of a service that lets you create conditional logic bots. The end user usually "talks" to the bot by clicking a button or selecting an item from a list. The words "typed" by the user appear on the screen not because the user actually typed anything, but because they clicked with their mouse.
Now, there are places where people do type their own words even in a conditional logic bot, but it is usually a response to the bot's question (e.g. "What is your name?" or "What is your email address?") and not a question coming from the user.
In other words, conditional logic bots do not really make an effort to understand what the user is typing.
Bot frameworks such as Dialogflow, on the other hand, are primarily built for making sense of what the user types. The topic is a bit too vast to cover in a single article, but you will notice that all the bot frameworks usually have features which are similar to the core features already built into Dialogflow: intents, entities, contexts and webhooks.
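To make the idea of an "intent" concrete, here is a deliberately crude sketch. This is not how Dialogflow works internally (real frameworks use trained language models, not keyword overlap); the intent names and keywords are made up for illustration. It only shows the core job: mapping free-form user text to one of a set of intents.

```python
# Toy sketch of intent matching -- NOT Dialogflow's actual mechanism,
# just an illustration of what "matching an intent" means.

# Hypothetical intents, each with a few keywords standing in for
# the training phrases a real framework would learn from.
INTENTS = {
    "check_weather": ["weather", "forecast", "rain"],
    "book_table": ["reserve", "table", "booking"],
}

def match_intent(user_text: str) -> str:
    """Return the intent whose keywords overlap most with the user's text."""
    words = set(user_text.lower().split())
    best_intent, best_score = "fallback", 0
    for intent, keywords in INTENTS.items():
        score = len(words & set(keywords))
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

print(match_intent("what is the weather forecast for tomorrow"))  # check_weather
print(match_intent("hello there"))  # fallback
```

Notice that this is the opposite of a conditional logic bot: instead of the user picking from predefined buttons, the framework has to make sense of whatever the user typed.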
Adding 'AI' to a bot building service
I noticed that Chatfuel is trying to add some 'AI' into its service. They plan to implement this by trying to 'understand' what the user is typing.
There is a problem with this approach: it breaks the paradigm that makes bot building frameworks work. Without 'entities' and 'contexts', it is practically impossible to build a smart chatbot.
In the example above, you can see that they directly map the user's words to the action that the Chatfuel bot must take.
However, even if you are only vaguely familiar with Dialogflow entities, you can see that fitting the concept of system entities into the approach taken by Chatfuel is much harder. Developer entities (those defined by the developer) will be even more problematic to fit into Chatfuel's UI, and I think user entities are pretty much impossible.
And then we have contexts.
The Nemo bot
I have a small quiz on my website where I ask what contexts are used for. Contexts are one of the most easily misunderstood concepts in Dialogflow, and I think the community is still trying to figure out good ways to explain it.
In a nutshell, contexts add memory to your chatbot session so you don't end up with Nemo bots.
And contexts, regardless of how a given bot framework implements the feature, are one of the most vital cogs in the chatbot machine. Without a good way to both specify them and use them in your bot, it will be very hard to go beyond Nemo bots.
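As a sketch of the idea only (the two-turn conversation below is hypothetical and does not reflect any framework's real API), a context boils down to per-session state carried from one turn to the next:

```python
# Illustrative sketch: a "context" is memory carried between turns of a
# session. Without it, the bot is a Nemo bot that forgets the last turn.

def handle_turn(user_text: str, context: dict) -> str:
    """Handle one conversational turn, reading and writing session context."""
    if user_text.startswith("my name is "):
        context["name"] = user_text[len("my name is "):]  # remember for later turns
        return f"Nice to meet you, {context['name']}!"
    if user_text == "what is my name?":
        # A context-free bot could never answer this follow-up question.
        return context.get("name", "I don't know yet.")
    return "Sorry, I didn't get that."

session = {}  # one context dict per conversation
print(handle_turn("my name is Alice", session))  # Nice to meet you, Alice!
print(handle_turn("what is my name?", session))  # Alice
```

Real frameworks add lifespans, scoping and more on top of this, but the essence is the same: the second turn only makes sense because the first one left something behind.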
And Chatfuel's UI is going to become a challenge if they try to add contexts to their service, as I see a lot of impedance mismatch.
This is why I think cooperation is a good choice here. And to be clear, what I say about Dialogflow applies equally to other bot building frameworks such as Microsoft LUIS, IBM Watson, wit.ai etc. But I use Dialogflow as the example because that's the one I know best.
Since Dialogflow exposes an API, you can build an agent inside Dialogflow and connect it to your Chatfuel bot. In other words, to add some 'brains' to your Chatfuel bot, it is much simpler in the long run if you can connect it to Dialogflow's service on the back end.
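As a rough sketch of what that back-end call could look like: Dialogflow's v2 REST API exposes a detectIntent method per session, and your webhook would POST the user's text to it. The project ID and session ID below are placeholders you would supply from your own Google Cloud setup, and authentication (an OAuth bearer token) is omitted. This snippet only builds the URL and JSON body; it does not actually send the request.

```python
import json

# Sketch only: builds a request for Dialogflow's v2 detectIntent REST
# endpoint. PROJECT_ID and SESSION_ID are placeholders; auth is omitted.
PROJECT_ID = "my-gcp-project"    # placeholder: your Google Cloud project
SESSION_ID = "chatfuel-user-42"  # placeholder: one session per end user

def build_detect_intent_request(user_text: str, language: str = "en"):
    """Return the (url, json_body) pair for a detectIntent call."""
    url = (f"https://dialogflow.googleapis.com/v2/projects/{PROJECT_ID}"
           f"/agent/sessions/{SESSION_ID}:detectIntent")
    body = {
        "queryInput": {
            "text": {"text": user_text, "languageCode": language}
        }
    }
    return url, json.dumps(body)

url, body = build_detect_intent_request("what is the weather tomorrow")
print(url)
```

The response from Dialogflow contains the matched intent and any extracted entities, which your Chatfuel bot can then act on.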
If you are interested in learning how to do this, you can take a look at this article.
In my view, unless the conditional logic bot services like Chatfuel and ManyChat implement reasonably robust mechanisms for adding 'developer entities' and 'contexts' into their bots, you cannot make their bots behave on par with what you can already build using Dialogflow.