Yahoo Web Search

Search Results

  1. Browse and explore hundreds of open source AI models for text generation, text-to-image, speech recognition, and more. Find the latest updates, trends, and features of each model on Hugging Face.

  2. Hugging Face is a platform where the machine learning community collaborates on models, datasets, and applications. Browse and host over 400k models, 150k applications, and 100k datasets for text, image, video, audio, and 3D modalities.

  3. Find, upload, and use models for various NLP and computer vision problems on the Hugging Face Hub. Learn how to explore, download, and integrate models with the Hub's features and tools.

    • Overview
    • Online demos
    • 100 projects using Transformers
    • Quick tour
    • Why should I use transformers?
    • Why shouldn't I use transformers?
    • Installation
    • Model architectures
    • Citation

    • 📝 Text, for tasks like text classification, information extraction, question answering, summarization, translation, and text generation, in over 100 languages.

    • 🖼️ Images, for tasks like image classification, object detection, and segmentation.

    • 🗣️ Audio, for tasks like speech recognition and audio classification.

    Transformer models can also perform tasks on several modalities combined, such as table question answering, optical character recognition, information extraction from scanned documents, video classification, and visual question answering.
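
    As a quick illustration of the audio side, here is a minimal sketch of a speech-recognition pipeline (the file name is a placeholder; any audio file the pipeline can decode works):

    ```python
    from transformers import pipeline

    # Allocate an automatic-speech-recognition pipeline; the default checkpoint is downloaded on first use
    asr = pipeline("automatic-speech-recognition")

    # "sample.flac" is a placeholder path; the result is a dict containing the transcribed text
    print(asr("sample.flac"))  # e.g. {'text': '...'}
    ```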

    🤗 Transformers provides APIs to quickly download and use those pretrained models on a given text, fine-tune them on your own datasets, and then share them with the community on our model hub. At the same time, each Python module defining an architecture is fully standalone and can be modified to enable quick research experiments.
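
    A minimal sketch of that download-and-use flow (the checkpoint name bert-base-uncased is just one example of a model hosted on the Hub, and PyTorch is assumed as the backend):

    ```python
    from transformers import AutoModel, AutoTokenizer

    # Download and cache a tokenizer and model from the Hub (the checkpoint name is an example)
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # Tokenize a piece of text and run the model on it
    inputs = tokenizer("Hello world!", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)
    ```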

    🤗 Transformers is backed by the three most popular deep learning libraries — Jax, PyTorch and TensorFlow — with a seamless integration between them. It's straightforward to train your models with one before loading them for inference with the other.
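
    A sketch of that interoperability, assuming a model directory that contains TensorFlow weights and both backends installed (the paths are placeholders):

    ```python
    from transformers import AutoModelForSequenceClassification

    # Load TensorFlow weights into the equivalent PyTorch model class
    model = AutoModelForSequenceClassification.from_pretrained("path/to/tf_checkpoint", from_tf=True)

    # Save the weights back out in PyTorch format for later inference
    model.save_pretrained("path/to/pt_checkpoint")
    ```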

    Transformers is more than a toolkit to use pretrained models: it's a community of projects built around it and the Hugging Face Hub. We want Transformers to enable developers, researchers, students, professors, engineers, and anyone else to build their dream projects.

    In order to celebrate the 100,000 stars of transformers, we have decided to put the spotlight on the community, and we have created the awesome-transformers page which lists 100 incredible projects built in the vicinity of transformers.

    To immediately use a model on a given input (text, image, audio, ...), we provide the pipeline API. Pipelines group together a pretrained model with the preprocessing that was used during that model's training. Here is how to quickly use a pipeline to classify positive versus negative texts:
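
    A sketch of the snippet this paragraph refers to (the input sentence is just an example; the exact score depends on the default checkpoint):

    ```python
    from transformers import pipeline
    classifier = pipeline("sentiment-analysis")  # downloads and caches the default sentiment model
    print(classifier("We are very happy to introduce pipeline to the transformers repository."))
    # [{'label': 'POSITIVE', 'score': 0.9997}]
    ```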

    The second line of code downloads and caches the pretrained model used by the pipeline, while the third evaluates it on the given text. Here, the answer is "positive" with a confidence of 99.97%.

    Many tasks have a pre-trained pipeline ready to go, in NLP but also in computer vision and speech. For example, we can easily extract detected objects in an image:
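
    A sketch of an object-detection pipeline along those lines (the image path is a placeholder; any image opened with PIL works):

    ```python
    from PIL import Image
    from transformers import pipeline

    # Load any image from disk; the path is a placeholder
    image = Image.open("street_scene.jpg")

    # Allocate an object-detection pipeline; the default checkpoint is downloaded on first use
    object_detector = pipeline("object-detection")

    # Each result has a 'label', a confidence 'score', and a bounding 'box'
    print(object_detector(image))
    ```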

    Here, we get a list of objects detected in the image, with a box surrounding each object and a confidence score; in the README, the original image is shown on the left with the predictions displayed on the right.

    1. Easy-to-use state-of-the-art models:

    • High performance on natural language understanding & generation, computer vision, and audio tasks.

    • Low barrier to entry for educators and practitioners.

    • Few user-facing abstractions with just three classes to learn.

    • A unified API for using all our pretrained models.

    2. Lower compute costs, smaller carbon footprint: researchers can share trained models instead of always retraining, and practitioners can reduce compute time and production costs by starting from pretrained checkpoints.

    Why shouldn't I use transformers?

    • This library is not a modular toolbox of building blocks for neural nets. The code in the model files is deliberately not refactored with additional abstractions, so that researchers can quickly iterate on each of the models without diving into additional abstractions/files.

    • The training API is not intended to work on any model; it is optimized to work with the models provided by the library. For generic machine learning loops, you should use another library (such as 🤗 Accelerate).

    With pip

    This repository is tested on Python 3.8+, Flax 0.4.1+, PyTorch 1.11+, and TensorFlow 2.6+. You should install 🤗 Transformers in a virtual environment; if you're unfamiliar with Python virtual environments, check out the user guide.

    First, create a virtual environment with the version of Python you're going to use and activate it. Then install at least one of Flax, PyTorch, or TensorFlow. Please refer to the TensorFlow installation page, the PyTorch installation page, and/or the Flax and Jax installation pages for the specific installation command for your platform.

    Once one of those backends is installed, 🤗 Transformers can be installed with pip, as sketched below. If you'd like to play with the examples or need the bleeding edge of the code and can't wait for a new release, you must install the library from source.
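
    One common sequence, sketched in bash (PyTorch is chosen as the backend here purely as an example):

    ```bash
    # create and activate a virtual environment
    python -m venv .env
    source .env/bin/activate

    # install a backend first (torch here; tensorflow or flax also work)
    pip install torch

    # install the latest release of Transformers
    pip install transformers

    # or, for the bleeding edge, install from source
    pip install git+https://github.com/huggingface/transformers
    ```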

    With conda

    🤗 Transformers can also be installed with conda, as sketched below. Follow the installation pages of Flax, PyTorch, or TensorFlow to see how to install them with conda.
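
    The exact channel depends on the release; recent versions are published on conda-forge (older releases used the huggingface channel), so something like:

    ```bash
    conda install conda-forge::transformers
    ```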

    All the model checkpoints provided by 🤗 Transformers are seamlessly integrated from the huggingface.co model hub, where they are uploaded directly by users and organizations.

    Current number of checkpoints:

    🤗 Transformers currently provides the following architectures (see here for a high-level summary of each of them):

    1. ALBERT (from Google Research and the Toyota Technological Institute at Chicago) released with the paper ALBERT: A Lite BERT for Self-supervised Learning of Language Representations, by Zhenzhong Lan, Mingda Chen, Sebastian Goodman, Kevin Gimpel, Piyush Sharma, Radu Soricut.

    2. ALIGN (from Google Research) released with the paper Scaling Up Visual and Vision-Language Representation Learning With Noisy Text Supervision by Chao Jia, Yinfei Yang, Ye Xia, Yi-Ting Chen, Zarana Parekh, Hieu Pham, Quoc V. Le, Yunhsuan Sung, Zhen Li, Tom Duerig.

    3. AltCLIP (from BAAI) released with the paper AltCLIP: Altering the Language Encoder in CLIP for Extended Language Capabilities by Chen, Zhongzhi and Liu, Guang and Zhang, Bo-Wen and Ye, Fulong and Yang, Qinghong and Wu, Ledell.

    We now have a paper you can cite for the 🤗 Transformers library: Wolf et al., "Transformers: State-of-the-Art Natural Language Processing," Proceedings of EMNLP 2020: System Demonstrations.

    Transformers is a toolkit for state-of-the-art machine learning on different modalities, backed by Jax, PyTorch and TensorFlow. It provides thousands of pretrained models, APIs to download, fine-tune and share them, and online demos for various tasks.

  4. Hugging Face Model Hub (www.hugging-face.org › hugging-face-model-hub)

    Nov 17, 2023 · Learn how to find and download various machine learning models, especially in natural language processing, from the Hugging Face Model Hub. See how to navigate the platform, view the model files, and use the tags to identify the model types.

  5. Browse the latest and most popular models for text generation, text-to-image, visual question answering, and more on Hugging Face. Find models by meta-llama, microsoft, apple, stabilityai, and other leading researchers and organizations.

  6. Jan 10, 2024 · Hugging Face is a platform that offers thousands of AI models, datasets, and demo apps for NLP, computer vision, audio, and multimodal tasks. Learn how to create an account, set up your environment, and use pre-trained models on Hugging Face.