Deep Learning Framework Power Scores 2018

By Jeff Hale

Who’s on top in usage, interest, and popularity?

Deep learning continues to be the hottest thing in data science. Deep learning frameworks are changing rapidly. Just five years ago, none of the leaders other than Theano were even around.

I wanted to find evidence for which frameworks merit attention, so I developed this power ranking. I used 11 data sources across 7 distinct categories to gauge framework usage, interest, and popularity. Then I weighted and combined the data in this Kaggle Kernel.

UPDATE SEPT 20, 2018: Due to popular demand, I expanded the frameworks evaluated to include Caffe, Deeplearning4J, Caffe2, and Chainer. Now all deep learning frameworks with more than 1% reported usage in the KDnuggets usage survey are included.

UPDATE SEPT 21, 2018: I made a number of methodological improvements in several of the metrics.

Without further ado, here are the Deep Learning Framework Power Scores:

While TensorFlow is the clear winner, there were some surprising findings. Let’s dive in!

The Contenders

All of these frameworks are open source. All except one work with Python, and some can work with R or other languages.

TensorFlow is the undisputed heavyweight champion. It has the most GitHub activity, Google searches, Medium articles, books on Amazon, and ArXiv articles. It also has the most developers using it and is listed in the most online job descriptions. TensorFlow is backed by Google.

Keras has an “API designed for human beings, not machines.” It is the second most popular framework in nearly all evaluation areas. Keras sits on top of TensorFlow, Theano, or CNTK. Start with Keras if you are new to deep learning.
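To illustrate why Keras is the recommended starting point, here is a minimal sketch of a Keras classifier. The layer sizes and data are arbitrary placeholders chosen for illustration, and the sketch assumes the `keras` package is installed:

```python
# A minimal Keras classifier sketch -- the layer sizes and data below are
# arbitrary placeholders, shown only to illustrate how compact the Keras API is.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Fake data: 100 samples with 20 features each, and binary labels.
X = np.random.rand(100, 20)
y = np.random.randint(0, 2, size=(100, 1))

# Define, compile, and train a tiny network in a handful of lines.
model = Sequential([
    Dense(16, activation="relu", input_shape=(20,)),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

# Predict returns one probability per sample.
preds = model.predict(X, verbose=0)
print(preds.shape)
```

The whole model definition, training, and prediction loop fits in a dozen lines, which is much of Keras's appeal for newcomers.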

PyTorch is the third most popular overall framework and the second most popular stand-alone framework. It is younger than TensorFlow and has grown rapidly in popularity. It allows customization that TensorFlow does not. It has the backing of Facebook.

Caffe is the fourth most popular framework. It has been around for nearly five years. It is relatively in demand among employers and often mentioned in scholarly articles, but has seen little reported recent usage.

Theano was developed at the University of Montreal in 2007 and is the oldest significant Python deep learning framework. It has lost much of its popularity, and its leader stated that major releases were no longer on the roadmap. However, updates continue to be made. Theano is still the fifth-highest-scoring framework.

MXNET is incubated by Apache and used by Amazon. It is the sixth most popular deep learning library.

CNTK is the Microsoft Cognitive Toolkit. It reminds me of many other Microsoft products in the sense that it is trying to compete with Google and Facebook offerings and is not winning significant adoption.

Deeplearning4J, also called DL4J, is used with the Java language. It’s the only semi-popular framework not available in Python. However, you can import models written with Keras to DL4J. This was the only framework where two different search terms occasionally had different results. I used the higher number for each metric. As the framework scored quite low, this made no material difference.

Caffe2 is another Facebook open source product. It builds on Caffe and is now being housed in the PyTorch GitHub repository. Because it no longer has its own repository I used the GitHub data from its old repository.

Chainer is a framework developed by the Japanese company Preferred Networks. It has a small following.

FastAI is built on PyTorch. Its API was inspired by Keras and requires even less code for strong results. FastAI is bleeding edge as of mid-Sept 2018. It’s undergoing a rewrite for version 1.0, slated for an October 2018 release. Jeremy Howard, the force behind FastAI, has been a top Kaggler and President of Kaggle. He discusses why FastAI switched from Keras to make their own framework here.

FastAI is not yet in demand for careers nor is it being used widely. However, it has a large built-in pipeline of users through its popular free online courses. It is also both powerful and easy to use. Its adoption could grow significantly.


I chose the following categories to provide a well-rounded view of popularity and interest in deep learning frameworks.

The evaluation categories are:

  • Online Job Listings
  • KDnuggets Usage Survey
  • Google Search Volume
  • Medium Articles
  • Amazon Books
  • ArXiv Articles
  • GitHub Activity

Searches were performed Sept. 16 to Sept. 21, 2018. Source data is in this Google sheet.

I used the plotly data visualization library and Python’s pandas library to explore popularity. For the interactive plotly charts, see my Kaggle Kernel here.

Online Job Listings

What deep learning libraries are in demand in today’s job market? I searched job listings on LinkedIn, Indeed, Simply Hired, Monster, and Angel List.

TensorFlow is the clear winner when it comes to frameworks mentioned in job listings. Learn it if you want a job doing deep learning.

I searched using the term “machine learning” followed by the library name, so TensorFlow was evaluated with “machine learning TensorFlow”. I tested several search methods and this one gave the most relevant results.

An additional keyword was necessary to differentiate the frameworks from unrelated terms because Caffe can have multiple meanings.


KDnuggets, a popular data science website, polled data scientists around the world on the software that they used. They asked:

What Analytics, Big Data, Data Science, Machine Learning software you used in the past 12 months for a real project?

Here are the results for the frameworks in this category.

Keras showed a surprising amount of use — nearly as much as TensorFlow. It’s interesting that US employers are overwhelmingly looking for TensorFlow skills, when — at least internationally — Keras is used almost as frequently.

This category is the only one that includes international data because it would have been cumbersome to include international data for the other categories.

KDnuggets reported several years of data. While I used 2018 data only in this analysis, I should note that Caffe, Theano, MXNET, and CNTK saw usage fall since 2017.

Google Search Activity

Web searches on the largest search engine are a good gauge of popularity. I looked at search history in Google Trends over the past year. Google doesn’t provide absolute search numbers, but it does provide relative figures.

I updated this article Sept. 21, 2018 so these scores would include worldwide searches in the Machine Learning and Artificial Intelligence category for the week ended Sept. 15, 2018. Thanks to François Chollet for his suggestion to improve this search metric.

Keras was not far from TensorFlow. PyTorch was in third and other frameworks had relative search volume scores at or below four. These scores were used for the power score calculations.

Let’s look briefly at how search volume has changed over time to provide more historical context. The chart from Google directly below shows searches over the past two years.

TensorFlow = red, Keras = yellow, PyTorch = blue, Caffe = green

Searches for TensorFlow haven’t really been growing for the past year, but Keras and PyTorch have seen growth. Google Trends allows only five terms to be compared simultaneously, so the other libraries were compared on separate charts. None of the other libraries showed anything other than minimal search interest relative to TensorFlow.


I included several publication types in the power score. Let’s look at Medium articles first.

Medium Articles

Medium is the place for popular data science articles and guides. And you’re here now — fantastic!

Finally, a new winner. In terms of mentions in Medium articles, Keras broke the tape ahead of TensorFlow. FastAI outperformed relative to its usual showing.

I hypothesize that these results might have occurred because Keras and FastAI are beginner friendly. They have quite a bit of interest from new deep learning practitioners, and Medium is often a forum for tutorials.

I used Google site search over the past 12 months with the framework name and “learning” as the keywords. This method was necessary to prevent incorrect results for the term “caffe”. It had the smallest reduction in articles of the several search options I tried.

Now let’s see which frameworks have books about them available on Amazon.

Amazon Books

I searched for each deep learning framework on Amazon under Books → Computers & Technology.

TensorFlow for the win again. MXNET had more books than expected and Theano had fewer. PyTorch had relatively few books, but that may be because of the framework’s youth. This measure is biased in favor of older libraries because of the time it takes to publish a book.

ArXiv Articles

ArXiv is the online repository where most scholarly machine learning articles are published. I searched for each framework on arXiv using Google site search results over the past 12 months.

More of the same from TensorFlow for scholarly articles. Notice how much more popular Keras was on Medium and Amazon than in scholarly articles. PyTorch was second in this category, showing its flexibility for implementing new ideas. Caffe also performed relatively well.

GitHub Activity

Activity on GitHub is another indicator of framework popularity. I broke out stars, forks, watchers, and contributors in the charts below because they make more sense separately than combined.

TensorFlow is clearly the most popular framework on GitHub, with a whole lot of engaged users. FastAI has a decent following considering it isn’t even a year old. It’s interesting to see that contributor levels are closer for all of the frameworks than the other three metrics.

After gathering and analyzing the data it was time to consolidate it into one metric.

Power Scoring Procedure

Here’s how I created the power score:

  1. Scaled all features between 0 and 1.
  2. Aggregated Job Search Listings and GitHub Activity subcategories.
  3. Weighted categories according to the weights below.

As shown above, Online Job Listings and the KDnuggets Usage Survey make up half of the total score, while web searches, publications, and GitHub attention make up the other half. This split seemed like the most appropriate balance of the various categories.

4. Multiplied weighted scores by 100 for comprehensibility.

5. Summed category scores for each framework into a single power score.
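The steps above can be sketched in a few lines of Python. The category weights and raw counts below are made-up placeholders for illustration; the article's actual weights and data live in the linked Kaggle Kernel.

```python
# Illustrative sketch of the power-score procedure described above.
# The weights and raw counts are hypothetical placeholders, not the
# article's actual data.

def min_max_scale(values):
    """Scale a list of raw scores to the 0-1 range (step 1)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical raw counts per framework for two of the categories.
raw = {
    "job_listings": {"TensorFlow": 3000, "Keras": 900, "PyTorch": 400},
    "arxiv_articles": {"TensorFlow": 1600, "Keras": 800, "PyTorch": 1200},
}

# Hypothetical category weights (step 3); the real ones are in the chart.
weights = {"job_listings": 0.35, "arxiv_articles": 0.1}

frameworks = ["TensorFlow", "Keras", "PyTorch"]
power = {f: 0.0 for f in frameworks}
for category, counts in raw.items():
    scaled = min_max_scale([counts[f] for f in frameworks])  # step 1
    for f, s in zip(frameworks, scaled):
        # Steps 3-5: weight each scaled score, multiply by 100,
        # and sum across categories into one power score.
        power[f] += weights[category] * s * 100

print(power)
```

Because min-max scaling gives the top framework in each category a score of 1, a framework that led every category would land exactly at the sum of the weights times 100, which is why 100 is the maximum possible power score.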

Here’s the data:

Here are the scores after weighting and aggregating the subcategories.

And here’s the pretty chart again showing the final power scores.

100 is the highest possible score, indicating first place in every category. TensorFlow nearly achieved a 100, which was not surprising after seeing it at or near the top of every category. Keras was a clear second.

To play with the charts interactively or fork the Jupyter Notebook, please head to this Kaggle Kernel.


For now, TensorFlow is firmly on top. It seems likely to continue to dominate in the short term. Given how quickly things move in the deep learning world though, that may change.

Time will tell if PyTorch surpasses TensorFlow as React surpassed Angular. The frameworks may be analogous. Both PyTorch and React are flexible frameworks backed by Facebook and are often considered easier to use than their Google-backed competitors.

Will FastAI gain users outside its courses? It has a large pipeline of students and an even more beginner-friendly API than Keras.

What do you think the future holds? Please share your thoughts below.

Suggestions for Learners

If you are considering learning one of these frameworks and have Python, numpy, pandas, sklearn, and matplotlib skills, I suggest you start with Keras. It has a large user base, is in demand by employers, has lots of articles on Medium, and has an API that is easy to use.

If you already know Keras, it might be tricky to decide on the next framework to learn. I suggest you pick either TensorFlow or PyTorch and learn it well so you can make great deep learning models.

TensorFlow is clearly the framework to learn if you want to master what is in demand. But PyTorch’s ease of use and flexibility are making it popular for researchers. Here’s a Quora discussion of the two frameworks.

Once you have those frameworks under your belt, I suggest you keep an eye on FastAI. Check out its free online course if you want to learn both basic and advanced deep learning skills. FastAI 1.0 promises to allow you to implement the latest deep learning strategies easily and to iterate quickly.

No matter which frameworks you choose, I hope you now have a better understanding of which deep learning frameworks are most in demand, most in use, and most written about.
