TRIAL 1: Failure
Using Conda Prompt:

conda install -c huggingface transformers

Using YAML file:

name: transformers
channels:
  - conda-forge
dependencies:
  - pip
  - pip:
    - transformers

LOGS:

(base) C:\Users\ash\Desktop>conda env create -f env.yml
Collecting package metadata (repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.12.0
  latest version: 4.13.0

Please update conda by running

    $ conda update -n base -c defaults conda

Downloading and Extracting Packages
libzlib-1.2.12       | 67 KB   | #### | 100%
setuptools-62.3.2    | 1.4 MB  | #### | 100%
xz-5.2.5             | 211 KB  | #### | 100%
libffi-3.4.2         | 41 KB   | #### | 100%
bzip2-1.0.8          | 149 KB  | #### | 100%
tzdata-2022a         | 121 KB  | #### | 100%
ucrt-10.0.20348.0    | 1.2 MB  | #### | 100%
vc-14.2              | 13 KB   | #### | 100%
tk-8.6.12            | 3.5 MB  | #### | 100%
python_abi-3.10      | 4 KB    | #### | 100%
sqlite-3.38.5        | 1.3 MB  | #### | 100%
vs2015_runtime-14.29 | 1.3 MB  | #### | 100%
wheel-0.37.1         | 31 KB   | #### | 100%
openssl-3.0.3        | 10.0 MB | #### | 100%
ca-certificates-2022 | 180 KB  | #### | 100%
python-3.10.4        | 16.2 MB | #### | 100%
pip-22.1.1           | 1.5 MB  | #### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Installing pip dependencies: \
Ran pip subprocess with arguments:
['C:\\Users\\ash\\Anaconda3\\envs\\transformers\\python.exe', '-m', 'pip', 'install', '-U', '-r', 'C:\\Users\\ash\\Desktop\\condaenv.xzuashl6.requirements.txt']
Pip subprocess output:
Collecting transformers
  Downloading transformers-4.19.2-py3-none-any.whl (4.2 MB)
Collecting tqdm>=4.27
  Downloading tqdm-4.64.0-py2.py3-none-any.whl (78 kB)
Collecting pyyaml>=5.1
  Downloading PyYAML-6.0-cp310-cp310-win_amd64.whl (151 kB)
Collecting regex!=2019.12.17
  Downloading regex-2022.4.24-cp310-cp310-win_amd64.whl (262 kB)
Collecting requests
  Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB)
Collecting numpy>=1.17
  Downloading numpy-1.22.4-cp310-cp310-win_amd64.whl (14.7 MB)
Collecting packaging>=20.0
  Downloading packaging-21.3-py3-none-any.whl (40 kB)
Collecting tokenizers!=0.11.3,<0.13,>=0.11.1
  Downloading tokenizers-0.12.1-cp310-cp310-win_amd64.whl (3.3 MB)
Collecting filelock
  Downloading filelock-3.7.0-py3-none-any.whl (10 kB)
Collecting huggingface-hub<1.0,>=0.1.0
  Downloading huggingface_hub-0.7.0-py3-none-any.whl (86 kB)
Collecting typing-extensions>=3.7.4.3
  Downloading typing_extensions-4.2.0-py3-none-any.whl (24 kB)
Collecting pyparsing!=3.0.5,>=2.0.2
  Downloading pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting colorama
  Downloading colorama-0.4.4-py2.py3-none-any.whl (16 kB)
Collecting certifi>=2017.4.17
  Downloading certifi-2022.5.18.1-py3-none-any.whl (155 kB)
Collecting charset-normalizer~=2.0.0
  Downloading charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting urllib3<1.27,>=1.21.1
  Downloading urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting idna<4,>=2.5
  Downloading idna-3.3-py3-none-any.whl (61 kB)
Installing collected packages: tokenizers, urllib3, typing-extensions, regex, pyyaml, pyparsing, numpy, idna, filelock, colorama, charset-normalizer, certifi, tqdm, requests, packaging, huggingface-hub, transformers
Successfully installed certifi-2022.5.18.1 charset-normalizer-2.0.12 colorama-0.4.4 filelock-3.7.0 huggingface-hub-0.7.0 idna-3.3 numpy-1.22.4 packaging-21.3 pyparsing-3.0.9 pyyaml-6.0 regex-2022.4.24 requests-2.27.1 tokenizers-0.12.1 tqdm-4.64.0 transformers-4.19.2 typing-extensions-4.2.0 urllib3-1.26.9
done
#
# To activate this environment, use
#
#     $ conda activate transformers
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(base) C:\Users\ash\Desktop>

--------------------------------------------

(base) C:\Users\ash\Desktop>conda activate transformers

(transformers) C:\Users\ash\Desktop>pip install ipykernel jupyter

(transformers) C:\Users\ash\Desktop>python -m ipykernel install --user --name transformers
Installed kernelspec transformers in C:\Users\ash\AppData\Roaming\jupyter\kernels\transformers

--------------------------------------------

TESTING IN PYTHON:

>>> import transformers as ppb
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.

--------------------------------------

(transformers) C:\Users\ash>conda install -c conda-forge tensorflow
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: \
Found conflicts! Looking for incompatible packages.
This can take several minutes. Press CTRL-C to abort.
failed

UnsatisfiableError: The following specifications were found to be incompatible with the existing python installation in your environment:

Specifications:

  - tensorflow -> python[version='3.5.*|3.6.*|>=3.5,<3.6.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|3.8.*|3.7.*|3.9.*']

Your python: python=3.10

If python is on the left-most side of the chain, that's the version you've asked for.
When python appears to the right, that indicates that the thing on the left is somehow not available for the python version you are constrained to.
Note that conda will not change your python version to a different minor version unless you explicitly specify that.
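The root cause of Trial 1: the YAML pinned no Python version, so conda resolved Python 3.10, for which conda-forge had no tensorflow build at the time, and pip installed transformers with no deep-learning backend at all. A minimal diagnostic sketch (assuming it is run inside the Trial 1 environment) that confirms this from Python:

import sys
import transformers

print(sys.version)                        # 3.10.4 here - the source of the conflict
print(transformers.__version__)           # 4.19.2
print(transformers.is_torch_available())  # False - no PyTorch backend installed
print(transformers.is_tf_available())     # False - no TensorFlow backend installed

If both checks return False, only tokenizers, configurations, and file/data utilities work, which is exactly what the warning above says.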
-------------------------------------

TRIAL 2: Success

$ conda env remove -n transformers --all

ENV.YML:

name: transformers
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pandas
  - pip:
    - transformers
    - tensorflow

ALTERNATIVE (NOT TRIED) ENV.YML FILE:

name: transformers
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pandas
  - openpyxl
  - ipykernel
  - jupyter
  - tensorflow
  - pip:
    - transformers

LOGS:

(base) C:\Users\ash\Desktop>conda env create -f env.yml
Collecting package metadata (repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.12.0
  latest version: 4.13.0

Please update conda by running

    $ conda update -n base -c defaults conda

Downloading and Extracting Packages
setuptools-62.3.2 | 1.4 MB  | #### | 100%
python-3.9.13     | 17.9 MB | #### | 100%
python_abi-3.9    | 4 KB    | #### | 100%
pandas-1.4.2      | 11.0 MB | #### | 100%
numpy-1.22.4      | 6.1 MB  | #### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Installing pip dependencies: /
Ran pip subprocess with arguments:
['C:\\Users\\ash\\Anaconda3\\envs\\transformers\\python.exe', '-m', 'pip', 'install', '-U', '-r', 'C:\\Users\\ash\\Desktop\\condaenv.m0blf3oh.requirements.txt']
Pip subprocess output:
Collecting transformers
  Using cached transformers-4.19.2-py3-none-any.whl (4.2 MB)
Collecting tensorflow
  Downloading tensorflow-2.9.1-cp39-cp39-win_amd64.whl (444.0 MB)
Collecting requests
  Using cached requests-2.27.1-py2.py3-none-any.whl (63 kB)
Collecting regex!=2019.12.17
  Downloading regex-2022.4.24-cp39-cp39-win_amd64.whl (262 kB)
Collecting tokenizers!=0.11.3,<0.13,>=0.11.1
  Downloading tokenizers-0.12.1-cp39-cp39-win_amd64.whl (3.3 MB)
Collecting filelock
  Using cached filelock-3.7.0-py3-none-any.whl (10 kB)
Collecting pyyaml>=5.1
  Downloading PyYAML-6.0-cp39-cp39-win_amd64.whl (151 kB)
Collecting tqdm>=4.27
  Using cached tqdm-4.64.0-py2.py3-none-any.whl (78 kB)
Collecting packaging>=20.0
  Using cached packaging-21.3-py3-none-any.whl (40 kB)
Collecting huggingface-hub<1.0,>=0.1.0
  Using cached huggingface_hub-0.7.0-py3-none-any.whl (86 kB)
Requirement already satisfied: numpy>=1.17 in c:\users\ash\anaconda3\envs\transformers\lib\site-packages (from transformers->-r C:\Users\ash\Desktop\condaenv.m0blf3oh.requirements.txt (line 1)) (1.22.4)
Requirement already satisfied: six>=1.12.0 in c:\users\ash\anaconda3\envs\transformers\lib\site-packages (from tensorflow->-r C:\Users\ash\Desktop\condaenv.m0blf3oh.requirements.txt (line 2)) (1.16.0)
Collecting termcolor>=1.1.0
  Downloading termcolor-1.1.0.tar.gz (3.9 kB)
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'done'
Collecting tensorflow-io-gcs-filesystem>=0.23.1
  Downloading tensorflow_io_gcs_filesystem-0.26.0-cp39-cp39-win_amd64.whl (1.5 MB)
Collecting protobuf<3.20,>=3.9.2
  Downloading protobuf-3.19.4-cp39-cp39-win_amd64.whl (895 kB)
Collecting absl-py>=1.0.0
  Downloading absl_py-1.0.0-py3-none-any.whl (126 kB)
Collecting typing-extensions>=3.6.6
  Using cached typing_extensions-4.2.0-py3-none-any.whl (24 kB)
Collecting libclang>=13.0.0
  Downloading libclang-14.0.1-py2.py3-none-win_amd64.whl (14.2 MB)
Collecting astunparse>=1.6.0
  Downloading astunparse-1.6.3-py2.py3-none-any.whl (12 kB)
Collecting google-pasta>=0.1.1
  Downloading google_pasta-0.2.0-py3-none-any.whl (57 kB)
Requirement already satisfied: setuptools in c:\users\ash\anaconda3\envs\transformers\lib\site-packages (from tensorflow->-r C:\Users\ash\Desktop\condaenv.m0blf3oh.requirements.txt (line 2)) (62.3.2)
Collecting tensorflow-estimator<2.10.0,>=2.9.0rc0
  Downloading tensorflow_estimator-2.9.0-py2.py3-none-any.whl (438 kB)
Collecting tensorboard<2.10,>=2.9
  Downloading tensorboard-2.9.0-py3-none-any.whl (5.8 MB)
Collecting opt-einsum>=2.3.2
  Downloading opt_einsum-3.3.0-py3-none-any.whl (65 kB)
Collecting gast<=0.4.0,>=0.2.1
  Downloading gast-0.4.0-py3-none-any.whl (9.8 kB)
Collecting wrapt>=1.11.0
  Downloading wrapt-1.14.1-cp39-cp39-win_amd64.whl (35 kB)
Collecting grpcio<2.0,>=1.24.3
  Downloading grpcio-1.46.3-cp39-cp39-win_amd64.whl (3.5 MB)
Collecting keras-preprocessing>=1.1.1
  Downloading Keras_Preprocessing-1.1.2-py2.py3-none-any.whl (42 kB)
Collecting h5py>=2.9.0
  Downloading h5py-3.7.0-cp39-cp39-win_amd64.whl (2.6 MB)
Collecting flatbuffers<2,>=1.12
  Downloading flatbuffers-1.12-py2.py3-none-any.whl (15 kB)
Collecting keras<2.10.0,>=2.9.0rc0
  Downloading keras-2.9.0-py2.py3-none-any.whl (1.6 MB)
Requirement already satisfied: wheel<1.0,>=0.23.0 in c:\users\ash\anaconda3\envs\transformers\lib\site-packages (from astunparse>=1.6.0->tensorflow->-r C:\Users\ash\Desktop\condaenv.m0blf3oh.requirements.txt (line 2)) (0.37.1)
Collecting pyparsing!=3.0.5,>=2.0.2
  Using cached pyparsing-3.0.9-py3-none-any.whl (98 kB)
Collecting google-auth<3,>=1.6.3
  Downloading google_auth-2.6.6-py2.py3-none-any.whl (156 kB)
Collecting tensorboard-plugin-wit>=1.6.0
  Downloading tensorboard_plugin_wit-1.8.1-py3-none-any.whl (781 kB)
Collecting markdown>=2.6.8
  Downloading Markdown-3.3.7-py3-none-any.whl (97 kB)
Collecting tensorboard-data-server<0.7.0,>=0.6.0
  Downloading tensorboard_data_server-0.6.1-py3-none-any.whl (2.4 kB)
Collecting werkzeug>=1.0.1
  Downloading Werkzeug-2.1.2-py3-none-any.whl (224 kB)
Collecting google-auth-oauthlib<0.5,>=0.4.1
  Downloading google_auth_oauthlib-0.4.6-py2.py3-none-any.whl (18 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2022.5.18.1-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting colorama
  Using cached colorama-0.4.4-py2.py3-none-any.whl (16 kB)
Collecting pyasn1-modules>=0.2.1
  Downloading pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting cachetools<6.0,>=2.0.0
  Downloading cachetools-5.2.0-py3-none-any.whl (9.3 kB)
Collecting rsa<5,>=3.1.4
  Downloading rsa-4.8-py3-none-any.whl (39 kB)
Collecting requests-oauthlib>=0.7.0
  Downloading requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting importlib-metadata>=4.4
  Downloading importlib_metadata-4.11.4-py3-none-any.whl (18 kB)
Collecting zipp>=0.5
  Downloading zipp-3.8.0-py3-none-any.whl (5.4 kB)
Collecting pyasn1<0.5.0,>=0.4.6
  Downloading pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting oauthlib>=3.0.0
  Downloading oauthlib-3.2.0-py3-none-any.whl (151 kB)
Building wheels for collected packages: termcolor
  Building wheel for termcolor (setup.py): started
  Building wheel for termcolor (setup.py): finished with status 'done'
  Created wheel for termcolor: filename=termcolor-1.1.0-py3-none-any.whl size=4832 sha256=34e6470d92e16cedf1b846cf239d01ce6c05ddff3b0ec5437ceff54ea7de2d15
  Stored in directory: c:\users\ash\appdata\local\pip\cache\wheels\b6\0d\90\0d1bbd99855f99cb2f6c2e5ff96f8023fad8ec367695f7d72d
Successfully built termcolor
Installing collected packages: tokenizers, termcolor, tensorboard-plugin-wit, pyasn1, libclang, keras, flatbuffers, zipp, wrapt, werkzeug, urllib3, typing-extensions, tensorflow-io-gcs-filesystem, tensorflow-estimator, tensorboard-data-server, rsa, regex, pyyaml, pyparsing, pyasn1-modules, protobuf, opt-einsum, oauthlib, keras-preprocessing, idna, h5py, grpcio, google-pasta, gast, filelock, colorama, charset-normalizer, certifi, cachetools, astunparse, absl-py, tqdm, requests, packaging, importlib-metadata, google-auth, requests-oauthlib, markdown, huggingface-hub, transformers, google-auth-oauthlib, tensorboard, tensorflow
Successfully installed absl-py-1.0.0 astunparse-1.6.3 cachetools-5.2.0 certifi-2022.5.18.1 charset-normalizer-2.0.12 colorama-0.4.4 filelock-3.7.0 flatbuffers-1.12 gast-0.4.0 google-auth-2.6.6 google-auth-oauthlib-0.4.6 google-pasta-0.2.0 grpcio-1.46.3 h5py-3.7.0 huggingface-hub-0.7.0 idna-3.3 importlib-metadata-4.11.4 keras-2.9.0 keras-preprocessing-1.1.2 libclang-14.0.1 markdown-3.3.7 oauthlib-3.2.0 opt-einsum-3.3.0 packaging-21.3 protobuf-3.19.4 pyasn1-0.4.8 pyasn1-modules-0.2.8 pyparsing-3.0.9 pyyaml-6.0 regex-2022.4.24 requests-2.27.1 requests-oauthlib-1.3.1 rsa-4.8 tensorboard-2.9.0 tensorboard-data-server-0.6.1 tensorboard-plugin-wit-1.8.1 tensorflow-2.9.1 tensorflow-estimator-2.9.0 tensorflow-io-gcs-filesystem-0.26.0 termcolor-1.1.0 tokenizers-0.12.1 tqdm-4.64.0 transformers-4.19.2 typing-extensions-4.2.0 urllib3-1.26.9 werkzeug-2.1.2 wrapt-1.14.1 zipp-3.8.0
done
#
# To activate this environment, use
#
#     $ conda activate transformers
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(base) C:\Users\ash\Desktop>conda activate transformers

(transformers) C:\Users\ash\Desktop>conda install -c conda-forge jupyter ipykernel

(transformers) C:\Users\ash\Desktop>python -m ipykernel install --user --name transformers
Installed kernelspec transformers in C:\Users\ash\AppData\Roaming\jupyter\kernels\transformers
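At this point the environment builds cleanly against Python 3.9 with TensorFlow coming from pip. A quick sanity check (a sketch, assuming it is run in a notebook on the newly registered transformers kernel) before the BERT test below:

import sys
import tensorflow as tf
import transformers

print(sys.version)                     # should now report 3.9.x
print(tf.__version__)                  # 2.9.1 per the log above
print(transformers.__version__)        # 4.19.2
print(transformers.is_tf_available())  # True once TensorFlow imports cleanly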
--------------------------------------------

TESTING LOGS:
import warnings
warnings.filterwarnings('ignore')

import transformers as ppb   # import added here for completeness; same alias as in Trial 1

print(ppb.__version__)   # 4.19.2

model_class, tokenizer_class, pretrained_weights = (ppb.BertModel, ppb.BertTokenizer, 'bert-base-uncased')
tokenizer = tokenizer_class.from_pretrained(pretrained_weights)
model = model_class.from_pretrained(pretrained_weights)

OUTPUT:

Downloading: 100% 226k/226k [00:01<00:00, 253kB/s]
Downloading: 100% 28.0/28.0 [00:00<00:00, 921B/s]
Downloading: 100% 570/570 [00:00<00:00, 14.5kB/s]
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
Input In [9], in <cell line: 2>()
      1 tokenizer = tokenizer_class.from_pretrained(pretrained_weights)
----> 2 model = model_class.from_pretrained(pretrained_weights)

File ~\Anaconda3\envs\transformers\lib\site-packages\transformers\utils\import_utils.py:788, in DummyObject.__getattr__(cls, key)
    786 if key.startswith("_"):
    787     return super().__getattr__(cls, key)
--> 788 requires_backends(cls, cls._backends)

File ~\Anaconda3\envs\transformers\lib\site-packages\transformers\utils\import_utils.py:776, in requires_backends(obj, backends)
    774 failed = [msg.format(name) for available, msg in checks if not available()]
    775 if failed:
--> 776     raise ImportError("".join(failed))

ImportError: BertModel requires the PyTorch library but it was not found in your environment. Checkout the instructions on the installation page: https://pytorch.org/get-started/locally/ and follow the ones that match your environment.

FIX:

(transformers) C:\Users\ash>conda install -c pytorch pytorch
Collecting package metadata (current_repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.12.0
  latest version: 4.13.0

Please update conda by running

    $ conda update -n base -c defaults conda

## Package Plan ##

  environment location: C:\Users\ash\Anaconda3\envs\transformers

  added / updated specs:
    - pytorch

The following packages will be downloaded:

    package             |            build
    --------------------|-----------------
    cudatoolkit-11.3.1  |       h59b6b97_2                545.3 MB
    libuv-1.40.0        |       he774522_0                  255 KB
    openssl-1.1.1o      |       h2bbff1b_0                  4.8 MB
    pytorch-1.11.0      | py3.9_cuda11.3_cudnn8_0          1.23 GB  pytorch
    pytorch-mutex-1.0   |             cuda                    3 KB  pytorch
    ------------------------------------------------------------
                                                  Total:  1.77 GB

The following NEW packages will be INSTALLED:

  blas               pkgs/main/win-64::blas-1.0-mkl
  cudatoolkit        pkgs/main/win-64::cudatoolkit-11.3.1-h59b6b97_2
  libuv              pkgs/main/win-64::libuv-1.40.0-he774522_0
  pytorch            pytorch/win-64::pytorch-1.11.0-py3.9_cuda11.3_cudnn8_0
  pytorch-mutex      pytorch/noarch::pytorch-mutex-1.0-cuda
  typing_extensions  pkgs/main/noarch::typing_extensions-4.1.1-pyh06a4308_0

The following packages will be SUPERSEDED by a higher-priority channel:

  openssl            conda-forge::openssl-1.1.1o-h8ffe710_0 --> pkgs/main::openssl-1.1.1o-h2bbff1b_0

Proceed ([y]/n)? y

Downloading and Extracting Packages
libuv-1.40.0       | 255 KB   | #### | 100%
openssl-1.1.1o     | 4.8 MB   | #### | 100%
pytorch-mutex-1.0  | 3 KB     | #### | 100%
cudatoolkit-11.3.1 | 545.3 MB | #### | 100%
pytorch-1.11.0     | 1.23 GB  | #### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done

(transformers) C:\Users\ash>
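With PyTorch installed, BertModel loads without the backend error. A minimal end-to-end sketch (the sample sentence and variable names are only illustrative) that tokenizes one sentence and extracts its BERT embeddings:

import torch
import transformers as ppb

tokenizer = ppb.BertTokenizer.from_pretrained('bert-base-uncased')
model = ppb.BertModel.from_pretrained('bert-base-uncased')

inputs = tokenizer("A quick test sentence for BERT.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds one 768-dimensional vector per token;
# the vector at position 0 ([CLS]) is commonly used as a sentence embedding.
print(outputs.last_hidden_state.shape)            # e.g. torch.Size([1, 9, 768])
cls_embedding = outputs.last_hidden_state[:, 0, :]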
Monday, May 30, 2022
Installing Python Package 'transformers' for BERT