Deep Dive on Hugging Face
The highway to the AI/ML gold rush
Preface
Over the past few weeks, we have conducted a sweeping study of the AI/ML industry. Hugging Face stood out in our user surveys: it was mentioned frequently and reviewed glowingly, with some users even claiming it was flawless. Since we believe user reviews best reflect the true value of a product or application, we decided to take a deep dive into Hugging Face and figure out what makes it so popular and well recognized among users.
An AI/ML community and platform, Hugging Face gained its popularity with Transformers, a library of pre-trained models, and its high-quality communities. Users can host and share machine learning models and datasets on Hugging Face, where they can also build, train, and deploy models. Today, Hugging Face offers 77,000 pre-trained models, with a focus on NLP. NLP models account for 50% of the total, down from 70% in early 2022, and the share is still falling. While its abundance of NLP models makes it the GitHub for NLP, Hugging Face aspires to become the GitHub for machine learning and to move into other parts of the ML workflow.
As of this writing, Hugging Face has more than 1,440 contributors and 35,200 users on GitHub, with 71,800 stars and 16,400 forks. On average, over 50,000 users download models from Hugging Face each day. Today, Transformers is the fastest-growing open-source project, and Hugging Face has become one of the best-known AI/ML communities in history.
In May 2022, Hugging Face secured $100 million in Series C funding at an estimated $2 billion valuation, led by Lux Capital with participation from Sequoia US and Coatue. Hugging Face serves over 1,000 clients, including Intel, Qualcomm, Pfizer, and Bloomberg. It reportedly generated $10 million in revenue in 2021 and was close to break-even in mid-2022.
AI/ML is one of the most important markets of the decade to come. Hugging Face makes AI/ML easy to access, with its core business sitting upstream in the ML workflow. Meanwhile, it is also exploring ways to monetize the downstream business, including private models, dataset hosting, model inference and deployment, and AutoTrain. If these monetization attempts take off, Hugging Face will likely penetrate the whole ML workflow as the center of AI/ML.
Clément Delangue, co-founder and CEO of Hugging Face, puts it this way: “I don’t really see a world where machine learning becomes the default way to build technology and where Hugging Face is the No. 1 platform for this, and we don’t manage to generate several billion dollars in revenue.” Brandon Reeves, a partner at Lux Capital, said, “If this vision can be delivered, Hugging Face is likely to become a $50 billion and even $100 billion company after going public.”
Contents
01 Thesis
02 What Is Hugging Face
03 Business Model
04 Market and Competition
05 Success Factors and Risks
06 Conclusion
07 Appendix: Interviewee Quotes
01. Thesis
As a community and platform for AI/ML, Hugging Face initially came to public attention with its NLP models. Today, it plays an important role in one of the most promising industries: AI/ML.
In the past decade, data has shown tremendous value as the technologies for data storage, processing, analysis, and visualization have evolved. However, there is still a lot to be tapped. The real challenge of the data industry does not lie in data processing and analysis but in the last mile, i.e., how to solve real problems with dynamic data, efficiently and in real time. In this process, AI/ML is the key.
NLP is an essential part of AI. Language carries enormous amounts of information; it is like the API of humanity, through which humans communicate with each other. We live at a time when software is ubiquitous and human-machine interaction is inevitable. While we used to interact with machines through code and structured data, NLP lets us do the same in natural language today. The turning point for NLP came as more and more models were built on the Transformer architecture, and the field is now undergoing substantial change.
02. What Is Hugging Face?
Development History
An AI/ML community and platform, Hugging Face gained its popularity with Transformers, a library of pre-trained models, and its high-quality communities. Users can host and share machine learning models and datasets on Hugging Face, where they can also build, train, and deploy models. Today, Hugging Face offers 77,000 pre-trained models, with a focus on NLP. NLP models account for 50% of the total, down from 70% in early 2022, and the share is still falling. While its abundance of NLP models makes it the GitHub for NLP, Hugging Face aspires to become the GitHub for machine learning and to move into more parts of the ML workflow.
Source: Hugging Face
That said, this is not what its business looked like at the start. Hugging Face was founded in 2016 as an AI application company that built chatbots, to which users could send messages and even selfies. At its peak, DAU reached 100,000 with a good user retention rate.
Source: TechCrunch
To train the bots' capabilities to process natural language, the team built a model hub to store NLP and ML models, while publishing some open-source models on GitHub.
In 2018, Google open-sourced the TensorFlow implementation of BERT. Later, Hugging Face released a more user-friendly and convenient PyTorch version called "PyTorch-Pretrained-BERT". The team used the more widespread PyTorch framework to replicate BERT's performance and made the pre-trained models available for download, enabling developers without large computing budgets to complete state-of-the-art fine-tuning within minutes.
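A hedged sketch of that workflow in today's transformers API (the original PyTorch-Pretrained-BERT interface differed slightly, and the two-label setup here is purely illustrative): download pre-trained weights once, then fine-tune only a small task head on top.

```python
import os

def fine_tune_sketch():
    """Load pre-trained BERT and attach a classification head for fine-tuning."""
    # pip install transformers torch; the first call downloads weights from the Hub.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2
    )

    # Tokenize a toy batch; a standard PyTorch loop over model(**batch, labels=...)
    # then fine-tunes the pre-trained weights on the downstream task.
    batch = tokenizer(
        ["great product", "terrible docs"], padding=True, return_tensors="pt"
    )
    return model, batch

# The download is heavy, so only run when explicitly requested.
if os.environ.get("RUN_HF_DEMO"):
    fine_tune_sketch()
```

The point of the design is that only the small classification head starts from scratch; the expensive language understanding comes free with the downloaded weights.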
By July 2019, PyTorch-Pretrained-BERT hosted six pre-trained language models in its repo: BERT, GPT, GPT-2, Transformer-XL, XLNet, and XLM. The library was then renamed PyTorch-Transformers, which later became the Transformers we know today. Hugging Face has since shifted its business focus from chatbots to building NLP communities and model hubs.
The Transformers model hub provides thousands of pre-trained models, covering almost all the important models (except GPT-3). Users can fine-tune these models to perform their own tasks.
💡Note:
Transformers is an umbrella term for the family of models based on the Transformer architecture; models like BERT and GPT are all built on Transformer.
The Transformers model hub is where these Transformer-based pre-trained models are aggregated.
The Transformers model hub has attracted a large number of data scientists and developers from start-ups and big tech companies like Google, Microsoft, and Facebook, who release and share their own models and also use models shared by other community members.
Source: Hugging Face
In fact, 99% of the models on Hugging Face are contributed by the community, with Hugging Face acting more as a planner and operator.
Transformers models can be used to perform the following tasks:
Text: text classification, information extraction, question answering, summarization, translation, and text generation, in more than 100 languages;
Image: image classification, object detection, and image segmentation;
Audio: speech recognition and audio classification.
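The task list above maps directly onto the library's pipeline API. A minimal sketch follows; the file paths are illustrative, and each call downloads a default pre-trained model from the Hub, so execution is wrapped in a helper:

```python
import os

def run_pipelines():
    """Exercise one pipeline per modality: text, image, and audio."""
    # pip install transformers; each pipeline() call fetches a default model.
    from transformers import pipeline

    # Text: sentiment classification.
    classify = pipeline("sentiment-analysis")
    print(classify("Pre-trained models save weeks of work."))

    # Text: English-to-French translation.
    translate = pipeline("translation_en_to_fr")
    print(translate("Machine learning for everyone."))

    # Image: classification of a local image file (hypothetical path).
    vision = pipeline("image-classification")
    print(vision("cat.jpg"))

    # Audio: speech recognition on a local audio file (hypothetical path).
    asr = pipeline("automatic-speech-recognition")
    print(asr("sample.wav"))

# Model downloads are heavy, so only run when explicitly requested.
if os.environ.get("RUN_HF_DEMO"):
    run_pipelines()
```

This one-line-per-task interface is precisely what lets users treat the model hub as ready-to-use infrastructure rather than research code.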
As models evolve rapidly and grow increasingly complex, managing and deploying them becomes ever more expensive and demanding. Hugging Face shortens development cycles by days or weeks and saves developers and data scientists a handsome sum of money.
Today, over 50,000 users download models from Hugging Face each day on average. Within just a few years, Hugging Face has gained more than 1,440 contributors and 35,200 users, with 71,800 stars and 16,400 forks. Transformers has emerged as the best-known model hub in NLP/ML and the fastest-growing open-source project, while Hugging Face is one of the most prestigious AI/ML communities in history.
Source: Brandon Reeves, partner at Lux Capital
Team
Hugging Face has three co-founders, all from France: Clément Delangue (CEO), Julien Chaumond (CTO), and Thomas Wolf (CSO). Clément is based in New York, while the other two founders are in Paris, with most employees working in the US and France.
From left to right: Julien Chaumond (CTO), Thomas Wolf (CSO), Clément Delangue (CEO)
Source: Station F
When the company was founded, Clément was the only co-founder with a business-school background (ESCP Business School); the rest were engineers. However, Clément studied computer science and built up his coding skills through Stanford's Computer Science Program. Before Hugging Face, he worked in product and marketing and had experience starting businesses independently. As a college student, he founded the note-taking tool company VideoNot as CEO, later serving as its product and marketing director. He then joined two software start-ups, Moodstocks and Mention, which were acquired by Google and Mynewsdesk respectively.
Julien and Thomas have technical backgrounds. From 2013 to 2015, Julien served as CTO of Glose, a reading platform founded in New York. He then returned to Paris to work as an engineer at Stupeflix, a video-creation platform acquired by GoPro in 2016. Thomas, after finishing his PhD in quantum mechanics in 2011, studied law at Paris 1 Panthéon-Sorbonne university for a year and joined Plasseraud IP, a leading IP law firm with over 100 years of history, providing legal, strategic, and technical consultation.
Today, Hugging Face has more than 100 employees, a large share of them engineers. The team has grown more diverse, and key members of the product, marketing, sales, and expert-consulting teams are all on board.
03. Business Model
Hugging Face started its monetization journey in 2021, with the following attempts:
· AutoTrain
AutoTrain is the successor to AutoNLP. It provides users with automated, easy-to-use, end-to-end model training: users only need to create a task and upload data, and AutoTrain creates, tunes, and evaluates models automatically, then surfaces the best one for the application.
AutoTrain's services are similar to those of AutoML vendors such as DataRobot and H2O.ai. It adopts pay-as-you-go pricing, charging users based on time and computing-resource usage. According to users, AutoTrain is more capable and cost-efficient than comparable offerings on the market.
· Inference API & Infinity
Hugging Face offers two paid inference products: Inference API and Infinity. Inference API came first and requires users to deploy data and models on Hugging Face's servers. The upside is that teams without internal infrastructure or data scientists can still leverage ML solutions, so the target users are often SMBs.
Large enterprises, especially financial, energy, and medical companies, cannot keep their data and models on third-party servers for security and compliance reasons, so Inference API does not fit their needs. As a result, Hugging Face developed another inference product for these clients, later known as Infinity, which lets clients deploy data and models on their own on-prem servers. Some clients have already put Infinity into production, and big financial companies like JP Morgan are potential Infinity clients.
Hugging Face's inference products rent CPU/GPU time under the conventional pay-as-you-go model. Inference solutions on the market are mostly expensive, and Hugging Face is no exception, so there is still room for it to drive down the prices of its inference products.
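From the user's side, the hosted Inference API is just an authenticated HTTP endpoint. A hedged stdlib-only sketch (the model id is illustrative, and the token is assumed to live in the HF_TOKEN environment variable):

```python
import json
import os
import urllib.request

# Illustrative model id; any public model on the Hub can be substituted.
API_URL = (
    "https://api-inference.huggingface.co/models/"
    "distilbert-base-uncased-finetuned-sst-2-english"
)

def build_request(text: str, token: str) -> urllib.request.Request:
    """Build an authenticated POST request for the hosted Inference API."""
    payload = json.dumps({"inputs": text}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# Only touch the network when a real token is supplied.
if os.environ.get("HF_TOKEN"):
    req = build_request("Hugging Face makes ML easy.", os.environ["HF_TOKEN"])
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```

The appeal for SMBs is visible in the shape of the code: no model weights, GPUs, or serving stack on the client side, just a POST request per prediction.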
· Private Hub
Just like GitHub, which monetizes the hosting of private code, Hugging Face charges users on a customized basis to host their models, datasets, and pipelines on Private Hub. Private Hub imposes strict restrictions on user access to ensure data security.
· Expert Support
Hugging Face has built a world-class internal team of ML experts to solve the issues clients encounter when deploying and implementing ML. It uses a hybrid "seat-based + customized" pricing model for its expert support services, with prices varying by company size and project scope and difficulty.
Among the above four business models, we believe AutoTrain and Inference API & Infinity will be the key profit drivers for Hugging Face going forward, because both are used very frequently and carry a high ARPU. In addition, the revenue they generate is closely linked to the volume of data used for training and inference. As more and more data enters the AI/ML workflow, AutoTrain and Inference are likely to grow exponentially; the room for growth is huge.
By comparison, the ceilings for Private Hub and Expert Support are lower. That said, Private Hub carries unique strategic significance, while Expert Support may help convert more clients to the other three products.
As of June 2022, Hugging Face serves over 1,000 clients, including Intel, Qualcomm, Pfizer, and Bloomberg. It reportedly generated $10 million in revenue in 2021 and was close to break-even in mid-2022.
In May 2022, Hugging Face secured $100 million in Series C funding at an estimated $2 billion valuation, led by Lux Capital with participation from Sequoia US and Coatue. Its $40 million Series B is reportedly still sitting in the bank, and its business and cash flow remain very healthy.
04. Market and Competition
Market
Hugging Face is currently focused on the NLP market and will gradually march into the broader ML sector. Straits Research estimates the global NLP market at $13.5 billion, expected to reach $91 billion by 2030 at a 27% CAGR. Meanwhile, the ML market is projected to reach $209.9 billion by 2030.
Hugging Face's business model suggests its current direction is AutoML + Private Hub + Large Language Models, and it might also expand into MLOps. If it succeeds, Hugging Face will grow into a giant in the AI/ML sector; even if not every attempt pans out, each pathway offers a huge market of its own.
Competition
The major competitors of Hugging Face include:
· OpenAI
People often compare Hugging Face with OpenAI. Unlike OpenAI, which is closed-source with only a few models, Hugging Face offers thousands of open-source pre-trained models. Because OpenAI's business is built on a handful of proprietary models such as GPT-3 and DALL·E 2, users can only consume the given models and are not allowed to fork or modify them.
Since OpenAI focuses on only a few models, the models it trains are bigger, which means their results are more accurate. Yet big models are extremely expensive. For instance, Microsoft and NVIDIA developed the world's biggest natural language model, MT-NLG (530 billion parameters). Training such a model requires hundreds of GPU servers, each costing nearly $200,000, plus networking and hosting services; replicating the experiment runs into the tens or even hundreds of millions of dollars, and even training a GPT-3-scale model is estimated at $4-10 million, an amount very few users can afford. By providing open-source pre-trained models, Hugging Face lowers the threshold and makes ML models accessible to everyone, democratizing AI/ML significantly.
In addition, Hugging Face introduced BLOOM, an open-source pre-trained Large Language Model (LLM) whose architecture is similar to GPT-3's. The launch of BLOOM intensifies the already fierce and direct competition between Hugging Face and OpenAI. With so many pre-trained models plus BLOOM, Hugging Face can meet the demands of long-tail clients while also catering to large, well-financed companies that pursue great precision.
· Companies Specializing in AutoML
Given that AutoTrain is a major source of revenue for Hugging Face, companies specializing in AutoML, such as DataRobot and H2O.ai, are key competitors of Hugging Face.
Yet compared with DataRobot, Hugging Face is still in its infancy: its ARPU is estimated at $10,000, versus $50,000-$100,000 for DataRobot, suggesting huge room for Hugging Face to grow its ARPU.
· Cloud Service Providers
The three major cloud service providers will become the ultimate competitors of Hugging Face. AWS's ML platform SageMaker provides full-stack solutions; Google is an industry leader in ML research and development; and Azure is speeding up the build-out of its ML platform. Microsoft invested $1 billion in OpenAI in 2019 and gained an exclusive license to OpenAI's GPT-3, making Azure OpenAI's sole cloud service provider.
In fact, Hugging Face has inked a strategic partnership with SageMaker. For Hugging Face, the partnership brings more users and use cases: since most user data is already stored on AWS, users can keep their data where it is while training models with Hugging Face, without the data-security concerns of calling external APIs. For SageMaker, Hugging Face holds the most state-of-the-art, continually updated models, an advantage SageMaker is unlikely to match in the near term, and Hugging Face's massive base of top ML users can deliver more workflows to AWS. In this sense, it is a win-win partnership, though strong competition between the two remains likely in the long run.
Having said that, one of Hugging Face's best advantages over cloud service providers and OpenAI is that it spans multiple clouds. This was also one of Snowflake's advantages when it competed with the cloud providers themselves. To avoid over-dependence on a single cloud provider and to ensure business security and stability, many clients deploy their business across the servers of different providers, so being multi-cloud will be the next big thing.
05. Success Factors and Risks
Success Factors
· Promising Market and Position
AI/ML will be one of the most important markets in the upcoming decade. A typical ML workflow starts with finding the right model:
When working on ML, users first look for models that fit their target application scenarios, checking whether there are ready-to-use models or models usable with minor adjustments. Even users who do not adopt pre-trained models can keep abreast of the latest developments in the industry and learn from other people's models.
People used to search for models on Google or in papers; now Hugging Face is their go-to place for finding the right ones. Many models on Hugging Face can meet specific needs with slight adjustments. Hugging Face also has a pool of the best ML experts and state-of-the-art models, making it hard for most ML data scientists and engineers to circumvent or overlook.
In this sense, Hugging Face is present in the upstream of the whole ML workflow, serving as a necessary entrance for users to access AI/ML. Building on its advantageous strategic position, Hugging Face is working to monetize the downstream business.
Once large models or general AI become sophisticated enough, a variety of models and products that are developed based on the large models for different sectors and scenarios will emerge. These models might end up in the hands of different start-ups. But there is also another possibility: the models will all be gathered on Hugging Face, as there is no better place than Hugging Face to store and share models.
On top of its unique market position, Hugging Face is also well-positioned in the AI/ML ecosystem. “The companies you would assume are competitors on first blush—whether it’s Google or Amazon or Facebook—almost all of them are proponents,” says Lux Capital’s Brandon Reeves. “It really feels like this Switzerland-like piece of real estate in the ecosystem.”
· A Fast-Growing Community with Great Loyalty
With Transformers as the fastest-growing open-source project, Hugging Face is home to 35,200 highly loyal and active users. It has attracted more than 1,440 contributors, with 71,800 stars and 16,400 forks, and over 50,000 users download models from Hugging Face each day on average.
Hugging Face has reached a plethora of users, including many from large enterprises; it has, so to speak, already put its hands into enterprises' pockets, and the next step is to take the money out. In fact, it has begun to succeed, monetizing a handful of big-budget clients like Intel, Pfizer, and Bloomberg, whose technology, medicine, and finance peers are well-financed with a strong willingness to pay. Scaling a B2B business requires expanding the client base from SMBs to (large) enterprises, and since Hugging Face can already secure such quality enterprise clients at an early stage, its ceiling won't be too low.
· A Talented and Ambitious Founder
Clément, co-founder and CEO of Hugging Face, is commercially talented and capable. Born in a small town in northern France, he started doing business young and became the top French merchant on eBay at 17.
His outstanding talent brought offers from many companies, including eBay and Google, but he chose to start his own business rather than join these big market players. As a college student, Clément co-founded the note-taking tool company VideoNot as CEO, becoming its product and marketing director after graduation. He then joined two software start-ups, Moodstocks and Mention, which were acquired by Google and Mynewsdesk respectively.
Clément has a clear idea of how to run Hugging Face well and dares to experiment boldly. His goal is to reach at least $1 billion in revenue, and his determination to take the company public is strong. He says he has turned down several "meaningful acquisition offers" and will not sell the business to Microsoft the way GitHub did. He hopes Hugging Face can list on Nasdaq under its iconic hugging-face emoji instead of a boring three-letter ticker.
Since his college years, Clément has enjoyed a strong reputation, and many look to him as an example to follow and admire. This has helped Hugging Face attract talented employees; one AWS data scientist noted that many outstanding colleagues with brilliant ideas had joined Hugging Face.
Risks
· Monetization, Will It Succeed or Fail?
Despite some initial results from Hugging Face's monetization exploration, whether those efforts will ultimately succeed remains our biggest concern.
From a statistical point of view, Hugging Face's monetization is still at a very early stage. Even with fast-growing data, we cannot yet judge whether Hugging Face has figured out, or can figure out, how to generate revenue.
The majority of Hugging Face users are researchers, including enterprise researchers as well as professors and students at universities and research institutes. This means Hugging Face has yet to be widely applied in real production environments. Moreover, researchers are technically strong but not particularly willing to pay for commercial products.
According to our user research, Hugging Face's pricing model is somewhat unreasonable. Its most important and valuable products are AutoTrain and Inference, yet the less frequently used AutoTrain is priced low while the more frequently used Inference is too expensive, fueling the perception that "Hugging Face is too costly to use". This will backfire on its monetization efforts, as it weighs on usage rates and users' continued willingness to pay.
Still, we are optimistic about Hugging Face's monetization efforts.
Hugging Face is home to many top ML practitioners and experts, many of whom come from big tech companies, financial institutions, and medical institutions. It has cultivated a huge group of quality and loyal users. Once ML achieves widespread adoption in production scenarios, Hugging Face is likely to become the center of the ML sector.
If Hugging Face does become the center of ML, making money will not be the most pressing task; the top priority is to expand its TAM to reach more people, teams, and enterprises. Considering that Hugging Face's communities and monetization products cover most parts of ML, a big enough TAM means more and more ML workflows will find their way to Hugging Face, and since those workflows are complex, Hugging Face can charge users in more scenarios. Reaching more users also means a bigger base for future conversion; as many Hugging Face users come from large companies, the possibility of breaking into the enterprise market is huge.
Unlike the founding teams of other open-source products and communities, Hugging Face's founders proactively explore ways of generating revenue, aiming through constant trials and adjustments to find the best path to monetization. They have also been hiring marketing and sales professionals over the past year to hone their monetization capabilities.
· Cut-throat Competition
Despite being a leader in building communities and model hubs, Hugging Face still faces fierce competition in AutoTrain, Inference, and MLOps, a market it is eager to enter.
Since many big tech enterprises, including the three major cloud service providers, and major players in the data industry see ML as their next strategic focus, Hugging Face is sure to face great pressure. Yet for a start-up with a unique strategic position, a battlefield dominated by big tech players creates both risks and opportunities (including the opportunity to be acquired by a big tech company).
06. Conclusion
Downside: Being Taken Over
It is said that many tech companies made acquisition offers to Hugging Face in January 2021, before it completed its Series B financing. Some offered good prices, but Clément rejected them all, and Hugging Face's valuation has since skyrocketed.
So far, these companies reportedly still stand by their offers. Given their great interest in a take-over, we can assume the worst-case scenario for Hugging Face is to be acquired by a tech giant, and at a high price.
Upside: Being the Center of AI/ML
As the GitHub for NLP, Hugging Face aspires to become the GitHub for ML. We believe, however, that its growth potential goes well beyond that: thanks to its strategic position and expanding monetization scenarios, Hugging Face will likely penetrate more parts of the ML workflow.
A glimpse at Hugging Face's business models and competitors shows that it covers AutoML, model training and inference, model hubs, and expert support. In other words, Hugging Face = DataRobot + OpenAI + GitHub. If it succeeds in penetrating more of the ML workflow, it will be on a par with SageMaker, with greater potential for growth.
Below are the competitors of, and comparable-company data for, Hugging Face's four major products. On the numbers, Hugging Face is pivoting toward a $10 billion market with roughly $500 million in revenue, which we consider a conservative estimate. First, Inference will be a major revenue source, but competitor data for this model is not openly available, so the figures above do not yet include the corresponding revenue and market value. Second, its competitors and comparable companies are still growing rapidly, so their future revenues and valuations will keep rising. Third, the market caps of these companies are heavily affected by the macroeconomic environment, which has depressed valuations in the secondary market.
💡Note:
Private Hub's business model is similar to GitHub's and GitLab's: Private Hub hosts models and datasets, whereas GitHub and GitLab host code. We therefore treat GitHub and GitLab as comparable companies.
There is no public inference business data for OpenAI or the three major cloud service providers.
Expert Support is not a major Hugging Face business, so we exclude it from the calculation and prediction.
Hugging Face makes AI/ML easy to access, with its core business sitting upstream in the ML workflow. If its attempts to monetize the downstream business take off, it will likely penetrate more parts of the ML workflow as the center of AI/ML. Indeed, Hugging Face has already become the center for AI/ML models in many fields, including image recognition for autonomous driving and recommendation systems in the pharmaceutical industry.
Beyond the huge market, Hugging Face's growth potential over the next few years is plain to see. According to Kaggle's research, Hugging Face's penetration rate broke the 10% mark at the end of 2021. If the S-curve teaches us anything, it is that the moment penetration hits 10% is a good time to invest: the company is still early but about to take off.
As Clément envisions it, “machine learning becomes the default way to build technology, and Hugging Face is the No. 1 platform for this.” Brandon Reeves, a partner at Lux Capital, said, “If this vision can be delivered, Hugging Face is likely to become a $50 billion and even $100 billion company after going public.”
07. Appendix: Customer Interviewee Quotes
1. AI Researcher at Facebook
The first step for data scientists to work on ML is to find the pre-trained models they can refer to or use directly. People used to search for the desired models on Google or in papers. Now Hugging Face is their go-to place for finding the right models. Many models on Hugging Face can cater to specific needs with slight adjustments. If the existing models won't work, you can turn to AutoTrain which can train new models with your data.
2. ML Engineer at Adobe
Hugging Face is home to many top ML practitioners and experts. It is hard for the majority of ML data scientists and engineers to circumvent or overlook Hugging Face; circumventing it is like cutting yourself off from the best minds. Usually, before training models, we see what others have done and look for models we can learn from and use.
3. ML Architect at Bloomberg
The advantage of Hugging Face's pre-trained models is that they are easy to use, fast and affordable. According to the end results, the difference between pre-trained models with a small number of parameters and large models is 0.1-0.3%, which is acceptable to many use scenarios. Compared with cloud service providers and OpenAI, one of Hugging Face's advantages is that it enables “multiple cloud servers”.
4. Senior ML Solutions Architect at Amazon
A lot of professionals at AWS, people I look up to, have joined Hugging Face. There are many Kagglers among them (Kaggle is the world's top and most prestigious data science competition). So I believe Hugging Face has a unique charm and very talented staff. But their sales team is not as good as the technical team; they need to make more aggressive efforts to hire the best sales talent.