Friday, September 23, 2022

Baseline Model For Bot Detection on Twitter Using VADER and RandomForestClassifier

Download Code

import pandas as pd
import numpy as np
import re
import seaborn as sns
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

nltk.download('vader_lexicon')



[nltk_data] Downloading package vader_lexicon to
[nltk_data]     /home/ashish/nltk_data...
[nltk_data]   Package vader_lexicon is already up-to-date!

True


df = pd.read_csv('input/tweets_of_f234_users_1663839312.csv')

userid_accountype = df[['userid', 'account_type']].drop_duplicates()
userid_accountype['account_type'].value_counts()

human    146
bot       71
Name: account_type, dtype: int64

%%time
pred = []
vader_label = []
sid = SentimentIntensityAnalyzer()
for sentence in df['clean_tweet'].values:
    ss = sid.polarity_scores(sentence)
    pred.append(ss['compound'])
    if ss['compound'] > -0.05 and ss['compound'] < 0.05:
        vader_label.append('neutral')
    elif ss['compound'] >= 0.05:
        vader_label.append('positive')
    else:  # ss['compound'] <= -0.05
        vader_label.append('negative')

CPU times: user 2min 40s, sys: 2 s, total: 2min 42s
Wall time: 2min 42s

df['vader_sentiment'] = pred
df['vader_label'] = vader_label
df_mean_sentiment = df.groupby(['userid', 'vader_label']).mean().reset_index()
df_mean_sentiment.rename({'vader_sentiment': 'mean_sentiment'}, axis='columns', inplace=True)
df_mean_sentiment

df_var_sentiment = df.groupby(['userid']).var().reset_index()
df_var_sentiment.rename({'vader_sentiment': 'variance_sentiment'}, axis='columns', inplace=True)
df_var_sentiment

df_mean_var = df_mean_sentiment.merge(df_var_sentiment, on=['userid'], how='inner')
df_mean_var

df_mean_var_w_label = df_mean_var.merge(userid_accountype, on='userid', how='inner')
df_mean_var_w_label
df_mean_var_w_label['account_type'].value_counts()

human    435
bot      205
Name: account_type, dtype: int64

sns.scatterplot(data=df_mean_var_w_label, x="mean_sentiment", y="variance_sentiment", hue="account_type", style="account_type")
sns.scatterplot(data = df_mean_var_w_label, x = "account_type", y = "mean_sentiment", hue = "account_type", style = "account_type")
sns.scatterplot(data = df_mean_var_w_label, x = "account_type", y = "variance_sentiment", hue = "account_type", style = "account_type")
def get_url_flag(in_tweet):
    m = re.search(r"http[a-zA-Z0-9/\-.:%]+", in_tweet)
    rtn = False
    if m:
        rtn = True
    return rtn

df['url_flag'] = df['clean_tweet'].apply(get_url_flag)
df.groupby(['account_type'])['url_flag'].value_counts()

account_type  url_flag
bot           True       68899
              False      47947
human         False     202148
              True      169643
Name: url_flag, dtype: int64

print(68899 / (68899 + 47947))
print(169643 / (169643 + 202148))

0.5896564709104292
0.4562859240810025

df['len'] = df['clean_tweet'].apply(len)
df.groupby(['account_type'])['len'].mean()

account_type
bot      111.761455
human    109.129188
Name: len, dtype: float64
clf = RandomForestClassifier(random_state=0)
X = df[['vader_sentiment', 'url_flag', 'len']]
y = df['account_type']
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=42)
clf = clf.fit(X_train, y_train)
pred = clf.predict(X_test)
labels = ['bot', 'human']
print(classification_report(y_test, y_pred=pred, labels=labels))

              precision    recall  f1-score   support

         bot       0.54      0.24      0.33     38217
       human       0.80      0.93      0.86    123034

    accuracy                           0.77    161251
   macro avg       0.67      0.59      0.60    161251
weighted avg       0.74      0.77      0.74    161251

Accuracy: 0.77
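The report above shows weak recall on the bot class (0.24), which is unsurprising given the roughly 3:1 human-to-bot imbalance. One low-effort next step worth trying (a sketch, not the post's code) is reweighting classes via RandomForestClassifier's class_weight parameter; the random data below merely stands in for the real features (vader_sentiment, url_flag, len):

```python
# A minimal sketch, assuming the same three features as the baseline above.
# class_weight='balanced' scales sample weights inversely to class frequency.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.normal(size=n),             # stand-in for vader_sentiment
    rng.integers(0, 2, size=n),     # stand-in for url_flag
    rng.integers(10, 280, size=n),  # stand-in for tweet length
])
y = np.where(rng.random(n) < 0.25, 'bot', 'human')  # ~3:1 imbalance, like the real data

clf = RandomForestClassifier(random_state=0, class_weight='balanced')
clf.fit(X, y)
pred = clf.predict(X)
```

With the real data, compare the bot-class recall in classification_report against the unweighted baseline to see whether reweighting helps.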

Tags: Technology,Natural Language Processing,

Thursday, September 22, 2022

Technology Listing Related to Ubuntu Software House (Sep 2022)

1. GNU Image Manipulation Program (GIMP)

2. LibreOffice Suite
3. LibreOffice Software Listing
4. Mozilla Firefox
5. PyCharm IDE (Professional Edition)
6. qBitTorrent (An open source Torrent client)
7. Tor Browser: for bypassing the network firewall of a private network, and for getting around the restricted-browsing settings of your ISP (Internet Service Provider).
8. VLC Media Player
9. Jami: A video conferencing application for all platforms.
Q: What does "Jami" mean?
Ans: The choice of the name Jami was inspired by the Swahili word 'jami', which means 'community' as a noun and 'together' as an adverb.
Tags: Technology,Linux,

Sentiment Analysis using BERT, DistilBERT and ALBERT (Installation)

We will do Sentiment Analysis using the code from this repo: GitHub
Note: The entire GitHub code base for this project is about 18 MB in size.
And the first time you run "server.py" from the Anaconda Prompt, it downloads the BERT model, which is about 450 MB.

Contents of YAML file for conda environment creation: env.yml

name: barissayil
channels:
  - defaults
  - conda-forge
  - pytorch
dependencies:
  - python==3.9
  - pip
  - pip:
    - transformers==4.15.0
    - pytorch
    - pandas
    - numpy
    - flask
    - flask_cors
    - scikit-learn

Running the command in Ubuntu Terminal

(base) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ conda env create -f env.yml
Collecting package metadata (repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.12.0
  latest version: 4.14.0

Please update conda by running

    $ conda update -n base -c defaults conda

Downloading and Extracting Packages
numpy-base-1.23.1    |  5.6 MB | 100%
pytz-2022.1          |  194 KB | 100%
tzdata-2022c         |  107 KB | 100%
numexpr-2.8.3        |  124 KB | 100%
ninja-1.10.2         |    8 KB | 100%
flask_cors-3.0.10    |   16 KB | 100%
certifi-2022.9.14    |  155 KB | 100%
libgcc-ng-11.2.0     |  5.3 MB | 100%
scipy-1.7.1          | 16.9 MB | 100%
setuptools-63.4.1    |  1.1 MB | 100%
libgomp-11.2.0       |  474 KB | 100%
numpy-1.23.1         |   11 KB | 100%
pip-22.1.2           |  2.5 MB | 100%
flask-2.1.3          |  130 KB | 100%
pandas-1.4.4         |  9.8 MB | 100%
bottleneck-1.3.5     |  115 KB | 100%
scikit-learn-1.1.1   |  6.1 MB | 100%
ninja-base-1.10.2    |  109 KB | 100%
python-3.9.0         | 18.1 MB | 100%
pyparsing-3.0.9      |  151 KB | 100%
typing_extensions-4. |   42 KB | 100%
typing-extensions-4. |    9 KB | 100%
_openmp_mutex-5.1    |   21 KB | 100%
zipp-3.8.0           |   15 KB | 100%
pytorch-1.10.2       | 44.1 MB | 100%
markupsafe-2.1.1     |   21 KB | 100%
cffi-1.15.1          |  228 KB | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done

Installed package of scikit-learn can be accelerated using scikit-learn-intelex.
More details are available here: https://intel.github.io/scikit-learn-intelex

For example:

    $ conda install scikit-learn-intelex
    $ python -m sklearnex my_application.py

Installing pip dependencies: - Ran pip subprocess with arguments: ['/home/ashish/anaconda3/envs/barissayil/bin/python', '-m', 'pip', 'install', '-U', '-r', '/home/ashish/Desktop/condaenv.6jfaxui9.requirements.txt']
Pip subprocess output:
Collecting transformers==4.15.0
  Downloading transformers-4.15.0-py3-none-any.whl (3.4 MB)
Requirement already satisfied: numpy>=1.17 in /home/ashish/anaconda3/envs/barissayil/lib/python3.9/site-packages (1.23.1)
Collecting regex!=2019.12.17
  Downloading regex-2022.9.13-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (769 kB)
Collecting requests
  Downloading requests-2.28.1-py3-none-any.whl (62 kB)
Collecting huggingface-hub<1.0,>=0.1.0
  Downloading huggingface_hub-0.9.1-py3-none-any.whl (120 kB)
Collecting pyyaml>=5.1
  Downloading PyYAML-6.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (661 kB)
Requirement already satisfied: packaging>=20.0 in /home/ashish/anaconda3/envs/barissayil/lib/python3.9/site-packages (21.3)
Collecting filelock
  Downloading filelock-3.8.0-py3-none-any.whl (10 kB)
  Downloading tqdm-4.64.1-py2.py3-none-any.whl (78 kB)
Collecting sacremoses
  Downloading sacremoses-0.0.53.tar.gz (880 kB)
  Preparing metadata (setup.py): finished with status 'done'
Collecting tokenizers<0.11,>=0.10.1
  Downloading tokenizers-0.10.3-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (3.3 MB)
Requirement already satisfied: typing-extensions>=3.7.4.3 in /home/ashish/anaconda3/envs/barissayil/lib/python3.9/site-packages (4.3.0)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /home/ashish/anaconda3/envs/barissayil/lib/python3.9/site-packages (3.0.9)
Collecting idna<4,>=2.5
  Downloading idna-3.4-py3-none-any.whl (61 kB)
Collecting charset-normalizer<3,>=2
  Downloading charset_normalizer-2.1.1-py3-none-any.whl (39 kB)
Requirement already satisfied: certifi>=2017.4.17 in /home/ashish/anaconda3/envs/barissayil/lib/python3.9/site-packages (2022.9.14)
Collecting urllib3<1.27,>=1.21.1
  Downloading urllib3-1.26.12-py2.py3-none-any.whl (140 kB)
Requirement already satisfied: six in /home/ashish/anaconda3/envs/barissayil/lib/python3.9/site-packages (1.16.0)
Requirement already satisfied: click in /home/ashish/anaconda3/envs/barissayil/lib/python3.9/site-packages (8.0.4)
Requirement already satisfied: joblib in /home/ashish/anaconda3/envs/barissayil/lib/python3.9/site-packages (1.1.0)
Building wheels for collected packages: sacremoses
  Building wheel for sacremoses (setup.py): finished with status 'done'
  Created wheel for sacremoses: filename=sacremoses-0.0.53-py3-none-any.whl size=895241 sha256=3aa00715128a0c0de964dd1229c0cd2704c6ddb45ef5407c4bb3e5d273808164
  Stored in directory: /home/ashish/.cache/pip/wheels/12/1c/3d/46cf06718d63a32ff798a89594b61e7f345ab6b36d909ce033
Successfully built sacremoses
Installing collected packages: tokenizers, urllib3, tqdm, regex, pyyaml, idna, filelock, charset-normalizer, sacremoses, requests, huggingface-hub, transformers
Successfully installed charset-normalizer-2.1.1 filelock-3.8.0 huggingface-hub-0.9.1 idna-3.4 pyyaml-6.0 regex-2022.9.13 requests-2.28.1 sacremoses-0.0.53 tokenizers-0.10.3 tqdm-4.64.1 transformers-4.15.0 urllib3-1.26.12

#
# To activate this environment, use
#
#     $ conda activate barissayil
#
# To deactivate an active environment, use
#
#     $ conda deactivate
Tags: Technology,BERT,Natural Language Processing,

Tuesday, September 20, 2022

Using Twitter API to fetch trending topics, tweets and users posting them

Download Code

Twitter API Documentation Snippets

Dated: 2022-Sep-20

GET trends/place

Returns the top 50 trending topics for a specific id, if trending information is available for it.

Note: The id parameter for this endpoint is the "where on earth identifier" or WOEID, which is a legacy identifier created by Yahoo and has been deprecated. Twitter API v1.1 still uses the numeric value to identify town and country trend locations. Reference our legacy blog post, or archived data.

Example WOEID locations include:

Worldwide: 1
UK: 23424975
Brazil: 23424768
Germany: 23424829
Mexico: 23424900
Canada: 23424775
United States: 23424977
New York: 2459115

To identify other ids, please use the GET trends/available endpoint.

The response is an array of trend objects that encode the name of the trending topic, the query parameter that can be used to search for the topic on Twitter Search, and the Twitter Search URL. The most up-to-date info available is returned on request. The created_at field will show when the oldest trend started trending. The as_of field contains the timestamp when the list of trends was created. The tweet_volume for the last 24 hours is also returned for many trends, if this is available.

Ref: https://developer.twitter.com/en/docs/twitter-api/v1/trends/trends-for-location/api-reference/get-trends-place
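As a small illustration of the endpoint described above (the bearer token is a placeholder; the WOEIDs come from the list in the docs), the requests library can build this call; using .prepare() lets you inspect the final URL without actually sending anything:

```python
# Hedged sketch: building (not sending) a GET trends/place request.
# 'A...V' stands in for a real bearer token, as elsewhere in this post.
import requests

WOEIDS = {'Worldwide': 1, 'United States': 23424977, 'New York': 2459115}

def trends_place_request(woeid, bearer_token):
    """Return a prepared (unsent) request for the trends/place endpoint."""
    return requests.Request(
        'GET',
        'https://api.twitter.com/1.1/trends/place.json',
        params={'id': woeid},
        headers={'Authorization': f'Bearer {bearer_token}'},
    ).prepare()

req = trends_place_request(WOEIDS['New York'], 'A...V')
print(req.url)
```

To actually fetch the trends, send it with `requests.Session().send(req)` and read `.json()` from the response, as the notebook later in this post does.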

GET trends/available

Get locations with trending topics.

Returns the locations that Twitter has trending topic information for. The response is an array of "locations" that encode the location's WOEID and some other human-readable information such as a canonical name and the country the location belongs in.

Note: This endpoint uses the "where on earth identifier" or WOEID, which is a legacy identifier created by Yahoo and has been deprecated. Twitter API v1.1 still uses the numeric value to identify town and country trend locations. Reference our legacy blog post for more details. The url returned in the response, where.yahooapis.com, is no longer valid.

Ref: https://developer.twitter.com/en/docs/twitter-api/v1/trends/locations-with-trending-topics/api-reference/get-trends-available

Resource URL

https://api.twitter.com/1.1/trends/available.json

Resource Information

Response formats: JSON
Requires authentication? Yes
Rate limited? Yes
Requests / 15-min window (user auth): 75
Requests / 15-min window (app auth): 75
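The 75-requests-per-15-minute limit above works out to one request every 12 seconds. A simple client-side throttle (a sketch of mine, not Twitter-provided code) can keep a polling loop under that budget:

```python
# Hedged sketch: client-side pacing for the documented 75 req / 15 min limit.
import time

WINDOW_SECONDS = 15 * 60   # the 15-minute rate-limit window
MAX_REQUESTS = 75          # per window (user or app auth)
MIN_INTERVAL = WINDOW_SECONDS / MAX_REQUESTS  # 12.0 seconds between calls

class Throttle:
    """Sleep just long enough between calls to stay under the limit."""

    def __init__(self, min_interval=MIN_INTERVAL):
        self.min_interval = min_interval
        self._last = None  # monotonic timestamp of the previous call

    def wait(self, now=None, sleep=time.sleep):
        # `now` and `sleep` are injectable for testing; real callers use defaults.
        now = time.monotonic() if now is None else now
        if self._last is not None and now - self._last < self.min_interval:
            delay = self.min_interval - (now - self._last)
            sleep(delay)
            now += delay
        self._last = now
```

Call `throttle.wait()` immediately before each request to the endpoint; note this only paces a single client and does not replace handling HTTP 429 responses.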

Application to get the "Elevated Access"

Q: How will you use the Twitter API or Twitter Data?
In your words. In English, please describe how you plan to use Twitter data and/or APIs. The more detailed the response, the easier it is to review and approve. Learn what specific information to include in your use case.
Ans: ...

The specifics

Please answer each of the following with as much detail and accuracy as possible. Failure to do so could result in delays to your access to the Twitter developer platform or rejected applications. Need help? Get support now.

Q: Are you planning to analyze Twitter data? If yes, please describe how you will analyze Twitter data including any analysis of Tweets or Twitter users.
Ans: No

Q: Will your App use Tweet, Retweet, Like, Follow, or Direct Message functionality?
Ans: No

Q: Do you plan to display Tweets or aggregate data about Twitter content outside Twitter?
Ans: No

Q: Will your product, service, or analysis make Twitter content or derived information available to a government entity? In general, schools, college, and universities do not fall under this category.
Ans: No

Ref: https://developer.twitter.com/en/portal/petition/standard/intent

Installation

Issue with 'conda-forge' repo and 'twitter' Python package

(base) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop/ws/gh/others_work$ conda activate rasa_py38
(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop/ws/gh/others_work$ conda install twitter -c conda-forge
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.

PackagesNotFoundError: The following packages are not available from current channels:

  - twitter

Current channels:

  - https://conda.anaconda.org/conda-forge/linux-64
  - https://conda.anaconda.org/conda-forge/noarch
  - https://repo.anaconda.com/pkgs/main/linux-64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/linux-64
  - https://repo.anaconda.com/pkgs/r/noarch

To search for alternate channels that may provide the conda package you're looking for, navigate to https://anaconda.org and use the search bar at the top of the page.

(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop/ws/gh/others_work$ pip3 install twitter
Collecting twitter
  Downloading twitter-1.19.6-py2.py3-none-any.whl (50 kB)
Requirement already satisfied: certifi in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from twitter) (2022.9.14)
Installing collected packages: twitter
Successfully installed twitter-1.19.6
(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop/ws/gh/others_work$

Code

File: recipe__oauth_login.py

# -*- coding: utf-8 -*-
import os
import sys

import twitter
from twitter.oauth import write_token_file, read_token_file
from twitter.oauth_dance import oauth_dance

# Go to http://twitter.com/apps/new to create an app and get these items
# Error 404 (Dated: 20220920)
# For URL: http://dev.twitter.com/pages/oauth_single_token

APP_NAME = 'e***1'
CONSUMER_KEY = 'I***9'
CONSUMER_SECRET = 'B***U'

def oauth_login(app_name=APP_NAME,
                consumer_key=CONSUMER_KEY,
                consumer_secret=CONSUMER_SECRET,
                token_file='out/twitter.oauth'):
    try:
        (access_token, access_token_secret) = read_token_file(token_file)
    except Exception:  # FileNotFoundError, IOError
        (access_token, access_token_secret) = oauth_dance(app_name, consumer_key, consumer_secret)
        if not os.path.isdir('out'):
            os.mkdir('out')
        write_token_file(token_file, access_token, access_token_secret)
        # Fixed: the original printed the sys.stderr object itself;
        # the status message should go *to* stderr instead.
        print("OAuth Success. Token file stored to", token_file, file=sys.stderr)
    return twitter.Twitter(auth=twitter.oauth.OAuth(access_token, access_token_secret,
                                                    consumer_key, consumer_secret))

if __name__ == '__main__':
    oauth_login(APP_NAME, CONSUMER_KEY, CONSUMER_SECRET)

File: recipe__make_twitter_request.py

# -*- coding: utf-8 -*-
import sys
import time

from urllib.error import URLError  # was urllib2.URLError in Python 2

import twitter

# See recipe__get_friends_followers.py for an example of how you might use
# make_twitter_request to do something like harvest a bunch of friend ids for a user

def make_twitter_request(t, twitterFunction, max_errors=3, *args, **kwArgs):

    # A nested function for handling common HTTPErrors. Return an updated value
    # for wait_period if the problem is a 502/503 error. Block until the rate
    # limit is reset if it's a rate-limiting issue.
    # (The Python 2 `print >> sys.stderr` statements below were ported to
    # `print(..., file=sys.stderr)` for Python 3.)
    def handle_http_error(e, t, wait_period=2):
        if wait_period > 3600:  # Seconds
            print('Too many retries. Quitting.', file=sys.stderr)
            raise e
        if e.e.code == 401:
            print('Encountered 401 Error (Not Authorized)', file=sys.stderr)
            return None
        if e.e.code in (502, 503):
            print('Encountered %i Error. Will retry in %i seconds' % (e.e.code, wait_period),
                  file=sys.stderr)
            time.sleep(wait_period)
            wait_period *= 1.5
            return wait_period
        # Rate limit exceeded. Wait 15 mins.
        # See https://dev.twitter.com/docs/rate-limiting/1.1/limits
        if e.e.code == 429:
            sleep_time = 15 * 60  # 15 mins
            print('Rate limit reached: sleeping for 15 mins', file=sys.stderr)
            time.sleep(sleep_time)
            return 0
        # What else can you do?
        raise e

    wait_period = 2
    error_count = 0
    while True:
        try:
            return twitterFunction(*args, **kwArgs)
        except twitter.api.TwitterHTTPError as e:
            error_count = 0
            wait_period = handle_http_error(e, t, wait_period)
            if wait_period is None:
                return
        except URLError:
            error_count += 1
            print("URLError encountered. Continuing.", file=sys.stderr)
            if error_count > max_errors:
                print("Too many consecutive errors...bailing out.", file=sys.stderr)
                raise

File: recipe__get_user_info.py

# -*- coding: utf-8 -*-
from recipe__oauth_login import oauth_login
from recipe__make_twitter_request import make_twitter_request

# Assume ids have been fetched from a scenario such as the
# one presented in recipe__get_friends_followers.py and that
# t is an authenticated instance of twitter.Twitter

def get_info_by_id(t, ids):
    id_to_info = {}
    while len(ids) > 0:
        # Process 100 ids at a time...
        ids_str = ','.join([str(_id) for _id in ids[:100]])
        ids = ids[100:]
        response = make_twitter_request(t, getattr(getattr(t, "users"), "lookup"), user_id=ids_str)
        if response is None:
            break
        if type(response) is dict:  # Handle Twitter API quirk
            response = [response]
        for user_info in response:
            id_to_info[user_info['id']] = user_info
    return id_to_info

# Similarly, you could resolve the same information by screen name
# using code that's virtually identical. These two functions
# could easily be combined.

def get_info_by_screen_name(t, screen_names):
    sn_to_info = {}
    while len(screen_names) > 0:
        # Process 100 screen names at a time...
        screen_names_str = ','.join([str(sn) for sn in screen_names[:100]])
        screen_names = screen_names[100:]
        response = make_twitter_request(t, getattr(getattr(t, "users"), "lookup"), screen_name=screen_names_str)
        if response is None:
            break
        if type(response) is dict:  # Handle Twitter API quirk
            response = [response]
        for user_info in response:
            sn_to_info[user_info['screen_name']] = user_info
    return sn_to_info

if __name__ == '__main__':
    # Be sure to pass in any necessary keyword parameters
    # if you don't have a token already stored on file
    t = oauth_login()

    # Basic usage...
    info = {}
    info.update(get_info_by_screen_name(t, ['ptwobrussell', 'socialwebmining']))
    info.update(get_info_by_id(t, ['2384071']))

    # Do something useful with the profile information like store it to disk
    import json
    print(json.dumps(info, indent=1))
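The two lookup helpers above share one pattern worth isolating: users/lookup accepts at most 100 ids per call, so the list is consumed 100 at a time and each batch is joined into a comma-separated string. A standalone sketch of just that batching step:

```python
# Sketch of the 100-per-call batching used by get_info_by_id /
# get_info_by_screen_name above (users/lookup caps each call at 100 ids).
def comma_batches(items, size=100):
    """Yield comma-joined batches of at most `size` items."""
    for i in range(0, len(items), size):
        yield ','.join(str(x) for x in items[i:i + size])

batches = list(comma_batches(list(range(250))))
# 250 ids -> three API calls: batches of 100, 100 and 50 ids
```

Slicing by `size` rather than popping items one by one keeps each network round-trip as full as the endpoint allows, which matters under a 15-minute rate-limit window.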

File: Jupyter Notebook

import requests
import json

headers = {'Authorization': 'Bearer A...V'}
r = requests.get("https://api.twitter.com/1.1/trends/available.json", headers=headers)
with open('trends_available_20220920.json', mode='w', encoding='utf8') as f:
    f.write(json.dumps(r.json()))
trends_available = r.json()

# WOEID for Delhi is: 20070458
headers = {'Authorization': 'Bearer A...V'}
url = 'https://api.twitter.com/1.1/trends/place.json?id={}'.format(20070458)
r = requests.get(url, headers=headers)
trends = r.json()[0]['trends']

# -*- coding: utf-8 -*-
import sys
import json
import twitter
from recipe__oauth_login import oauth_login

def search(t, q=None, max_batches=5, count=100):
    # See https://dev.twitter.com/docs/api/1.1/get/search/tweets
    search_results = t.search.tweets(q=q, count=count)
    statuses = search_results['statuses']

    # Iterate through more batches of results by following the cursor
    for _ in range(max_batches):
        try:
            next_results = search_results['search_metadata']['next_results']
        except KeyError:
            # No more results when next_results doesn't exist
            break

        # Create a dictionary from next_results, which has the following form:
        # ?max_id=313519052523986943&q=%23MentionSomeoneImportantForYou&include_entities=1
        kwargs = dict([kv.split('=') for kv in next_results[1:].split("&")])
        search_results = t.search.tweets(**kwargs)  # fixed: was 'twitter_api', an undefined name
        statuses += search_results['statuses']
    return statuses

if __name__ == '__main__':
    Q = ' '.join(sys.argv[1:])
    t = oauth_login()
    statuses = search(t, q=Q)
    print(json.dumps(statuses, indent=1))

Hi there! We're gonna get you all set up to use exploring_twitter_api_1.
Opening: https://api.twitter.com/oauth/authorize?oauth_token=clYlGQAAAAABfNOPAAABg1mCrT0
In the web browser window that opens please choose to Allow access. Copy the PIN number that appears on the next page and paste or type it here:
Please enter the PIN: 2153380
OAuth Success.
Token file stored to out/twitter.oauth

[]

t = oauth_login()

trends[0]

{'name': '#StockMarketindia', 'url': 'http://twitter.com/search?q=%23StockMarketindia', 'promoted_content': None, 'query': '%23StockMarketindia', 'tweet_volume': None}

statuses = search(t, q=trends[0]['query'])
statuses[0]['text']

'RT @yadav4priya: #StockMarketindia : #Nifty today may open above 17720 and move towards 17792 which is break out point. if sustains above…'

statuses[0]['id']

1572106127546122241

headers = {'Authorization': 'Bearer A...V'}
url = 'https://api.twitter.com/2/tweets/{}'.format(1572106127546122241)
r = requests.get(url, headers=headers)
print(r.json())

{'data': {'id': '1572106127546122241', 'text': 'RT @yadav4priya: #StockMarketindia : #Nifty today may open above 17720 and move towards 17792 which is break out point. if sustains above…'}}

from recipe__get_user_info import get_info_by_screen_name, get_info_by_id
info = {}
info.update(get_info_by_screen_name(t, ['ptwobrussell', 'socialwebmining']))
info.update(get_info_by_id(t, ['2384071']))  # Tim Oreilly
print(info)

{'ptwobrussell': {'id': 13085242, 'id_str': '13085242', 'name': 'Matthew Russell', 'screen_name': 'ptwobrussell', 'location': 'Franklin, TN', 'description': 'Maximizing Human Performance @ Strongest AI', 'url': 'https://t.co/aADFSpjQV2', 'entities': {'url': {'urls': [{'url': 'https://t.co/aADFSpjQV2', 'expanded_url': 'https://strongest.com', 'display_url': 'strongest.com', 'indices': [0, 23]}]}, 'description': {'urls': []}}, 'protected': False, 'followers_count': 1887, 'friends_count': 156, 'listed_count': 158, 'created_at': 'Tue Feb 05 08:16:12 +0000 2008', 'favourites_count': 878, 'utc_offset': None, 'time_zone': None, 'geo_enabled': False, 'verified': False, 'statuses_count': 1495, 'lang': None, 'status': {'created_at': 'Thu Jun 23 14:59:11 +0000 2022', 'id': 1539986432080609281, 'id_str': '1539986432080609281', 'text': 'This #MensHealthMonth I’m committing to helping break down stigma by supporting and
talking about men’s mental heal… https://t.co/paXZ52a69F', 'truncated': True, 'entities': {'hashtags': [{'text': 'MensHealthMonth', 'indices': [5, 21]}], 'symbols': [], 'user_mentions': [], 'urls': [{'url': 'https://t.co/paXZ52a69F', 'expanded_url': 'https://twitter.com/i/web/status/1539986432080609281', 'display_url': 'twitter.com/i/web/status/1…', 'indices': [117, 140]}]}, 'source': '<a href="https://hypefury.com" rel="nofollow">Hypefury</a>', 'in_reply_to_status_id': 1539986423603990528, 'in_reply_to_status_id_str': '1539986423603990528', 'in_reply_to_user_id': 13085242, 'in_reply_to_user_id_str': '13085242', 'in_reply_to_screen_name': 'ptwobrussell', 'geo': None, 'coordinates': None, 'place': None, 'contributors': None, 'is_quote_status': True, 'quoted_status_id': 1539986423603990528, 'quoted_status_id_str': '1539986423603990528', 'retweet_count': 0, 'favorite_count': 0, 'favorited': False, 'retweeted': False, 'possibly_sensitive': False, 'lang': 'en'}, 'contributors_enabled': False, 'is_translator': False, 'is_translation_enabled': False, 'profile_background_color': '888888', 'profile_background_image_url': 'http://abs.twimg.com/images/themes/theme1/bg.png', 'profile_background_image_url_https': 'https://abs.twimg.com/images/themes/theme1/bg.png', 'profile_background_tile': False, 'profile_image_url': 'http://pbs.twimg.com/profile_images/1518238159477387274/iEwIp3Rq_normal.jpg', 'profile_image_url_https': 'https://pbs.twimg.com/profile_images/1518238159477387274/iEwIp3Rq_normal.jpg', 'profile_banner_url': 'https://pbs.twimg.com/profile_banners/13085242/1650811824', 'profile_link_color': 'EE8336', 'profile_sidebar_border_color': '888888', 'profile_sidebar_fill_color': 'F7F7F7', 'profile_text_color': '333333', 'profile_use_background_image': True, 'has_extended_profile': False, 'default_profile': False, 'default_profile_image': False, 'following': False, 'follow_request_sent': False, 'notifications': False, 'translator_type': 'regular', 'withheld_in_countries': 
[]}, 'SocialWebMining': {'id': 132373965, 'id_str': '132373965', 'name': 'MiningTheSocialWeb', 'screen_name': 'SocialWebMining', 'location': '', 'description': 'Get the source code at GitHub: http://t.co/U0VmWrXpB9', 'url': 'http://t.co/CJfJDyM6ki', 'entities': {'url': {'urls': [{'url': 'http://t.co/CJfJDyM6ki', 'expanded_url': 'http://miningthesocialweb.com', 'display_url': 'miningthesocialweb.com', 'indices': [0, 22]}]}, 'description': {'urls': [{'url': 'http://t.co/U0VmWrXpB9', 'expanded_url': 'http://bit.ly/MiningTheSocialWeb2E', 'display_url': 'bit.ly/MiningTheSocia…', 'indices': [31, 53]}]}}, 'protected': False, 'followers_count': 4184, 'friends_count': 0, 'listed_count': 200, 'created_at': 'Tue Apr 13 02:10:40 +0000 2010', 'favourites_count': 33, 'utc_offset': None, 'time_zone': None, 'geo_enabled': False, 'verified': False, 'statuses_count': 778, 'lang': None, 'status': {'created_at': 'Mon Jan 28 14:06:01 +0000 2019', 'id': 1089887323116969985, 'id_str': '1089887323116969985', 'text': 'What did it take to write the new edition? 
[Output truncated: the cell prints the raw Twitter user-object dictionaries (profile metadata, entities, and latest status) for each userid, e.g. screen_name 'timoreilly', followers_count 1691407. Elided here for readability.]
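To close the loop on the baseline, the RandomForestClassifier, train_test_split, and classification_report imports at the top are there to fit a classifier on the per-user mean/variance sentiment features. A minimal sketch of that final step — using a synthetic stand-in for df_mean_var_w_label, since the tweets CSV is not bundled here:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic stand-in for df_mean_var_w_label; column names match the post,
# but the values are random (the real features come from the VADER pipeline).
rng = np.random.default_rng(0)
n = 640
demo = pd.DataFrame({
    "mean_sentiment": rng.normal(0.1, 0.2, n),
    "variance_sentiment": rng.normal(0.15, 0.05, n),
    "account_type": rng.choice(["human", "bot"], n, p=[0.68, 0.32]),
})

X = demo[["mean_sentiment", "variance_sentiment"]]
y = demo["account_type"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

With the real data, X would be df_mean_var_w_label[['mean_sentiment', 'variance_sentiment']] and y its account_type column.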
Tags: Python, Natural Language Processing

Monday, September 19, 2022

Installing Rasa 3.2.8 on Ubuntu using Conda

Contents of env.yml

name: rasa_py38
channels:
  - conda-forge
dependencies:
  - python=3.8
  - pip
  - spacy

(base) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ conda env create -f env.yml
Collecting package metadata (repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.12.0
  latest version: 4.14.0

Please update conda by running

    $ conda update -n base -c defaults conda

Downloading and Extracting Packages
libsqlite-3.39.3     | 789 KB  | ### | 100%
numpy-1.23.3         | 7.1 MB  | ### | 100%
dataclasses-0.8      | 10 KB   | ### | 100%
pathy-0.6.2          | 38 KB   | ### | 100%
srsly-2.4.4          | 540 KB  | ### | 100%
sqlite-3.39.3        | 789 KB  | ### | 100%
python-3.8.13        | 25.1 MB | ### | 100%
markupsafe-2.1.1     | 22 KB   | ### | 100%
spacy-loggers-1.0.3  | 13 KB   | ### | 100%
cython-blis-0.7.8    | 9.0 MB  | ### | 100%
cymem-2.0.6          | 42 KB   | ### | 100%
cffi-1.15.1          | 229 KB  | ### | 100%
click-8.1.3          | 146 KB  | ### | 100%
langcodes-3.3.0      | 156 KB  | ### | 100%
cryptography-37.0.4  | 1.5 MB  | ### | 100%
tqdm-4.64.1          | 82 KB   | ### | 100%
shellingham-1.5.0    | 12 KB   | ### | 100%
brotlipy-0.7.0       | 342 KB  | ### | 100%
spacy-3.4.1          | 6.4 MB  | ### | 100%
preshed-3.0.7        | 122 KB  | ### | 100%
wasabi-0.10.0        | 26 KB   | ### | 100%
python_abi-3.8       | 4 KB    | ### | 100%
libzlib-1.2.12       | 65 KB   | ### | 100%
smart_open-5.2.1     | 43 KB   | ### | 100%
thinc-8.1.0          | 904 KB  | ### | 100%
typer-0.4.2          | 45 KB   | ### | 100%
pydantic-1.9.2       | 2.4 MB  | ### | 100%
spacy-legacy-3.0.10  | 20 KB   | ### | 100%
catalogue-2.0.8      | 32 KB   | ### | 100%
murmurhash-1.0.8     | 27 KB   | ### | 100%
libopenblas-0.3.21   | 10.1 MB | ### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
#
# To activate this environment, use
#
#     $ conda activate rasa_py38
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(base) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$
(base) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ conda activate rasa_py38
(rasa_py38)
ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ conda install ipykernel jupyterlab -c conda-forge
Collecting package metadata (current_repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.12.0
  latest version: 4.14.0

Please update conda by running

    $ conda update -n base -c defaults conda

## Package Plan ##

  environment location: /home/ashish/anaconda3/envs/rasa_py38

  added / updated specs:
    - ipykernel
    - jupyterlab

The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    argon2-cffi-bindings-21.2.0| py38h0a891b7_2        34 KB  conda-forge
    debugpy-1.6.3              | py38hfa26641_0       2.0 MB  conda-forge
    importlib-metadata-4.11.4  | py38h578d9bd_0        33 KB  conda-forge
    ipykernel-6.15.3           | pyh210e3f2_0          99 KB  conda-forge
    ipython-8.5.0              | pyh41d4057_1         552 KB  conda-forge
    jsonschema-4.16.0          | pyhd8ed1ab_0          65 KB  conda-forge
    jupyter_core-4.11.1        | py38h578d9bd_0        81 KB  conda-forge
    jupyterlab-3.4.7           | pyhd8ed1ab_0         5.9 MB  conda-forge
    lxml-4.9.1                 | py38h0a891b7_0       1.4 MB  conda-forge
    nbclient-0.6.8             | pyhd8ed1ab_0          65 KB  conda-forge
    nbformat-5.5.0             | pyhd8ed1ab_0         105 KB  conda-forge
    prompt-toolkit-3.0.31      | pyha770c72_0         254 KB  conda-forge
    psutil-5.9.2               | py38h0a891b7_0       346 KB  conda-forge
    pyrsistent-0.18.1          | py38h0a891b7_1        92 KB  conda-forge
    pyzmq-24.0.0               | py38hfc09fa9_0       499 KB  conda-forge
    terminado-0.15.0           | py38h578d9bd_0        28 KB  conda-forge
    tomli-2.0.1                | pyhd8ed1ab_0          16 KB  conda-forge
    tornado-6.2                | py38h0a891b7_0       653 KB  conda-forge
    traitlets-5.4.0            | pyhd8ed1ab_0          85 KB  conda-forge
    ------------------------------------------------------------
                                           Total:      12.3 MB

The following NEW packages will be INSTALLED:

  anyio              conda-forge/noarch::anyio-3.6.1-pyhd8ed1ab_1
  argon2-cffi        conda-forge/noarch::argon2-cffi-21.3.0-pyhd8ed1ab_0
  argon2-cffi-bindi~ conda-forge/linux-64::argon2-cffi-bindings-21.2.0-py38h0a891b7_2
  asttokens          conda-forge/noarch::asttokens-2.0.8-pyhd8ed1ab_0
  attrs              conda-forge/noarch::attrs-22.1.0-pyh71513ae_1
  babel              conda-forge/noarch::babel-2.10.3-pyhd8ed1ab_0
  backcall           conda-forge/noarch::backcall-0.2.0-pyh9f0ad1d_0
  backports          conda-forge/noarch::backports-1.0-py_2
  backports.functoo~ conda-forge/noarch::backports.functools_lru_cache-1.6.4-pyhd8ed1ab_0
  beautifulsoup4     conda-forge/noarch::beautifulsoup4-4.11.1-pyha770c72_0
  bleach             conda-forge/noarch::bleach-5.0.1-pyhd8ed1ab_0
  debugpy            conda-forge/linux-64::debugpy-1.6.3-py38hfa26641_0
  decorator          conda-forge/noarch::decorator-5.1.1-pyhd8ed1ab_0
  defusedxml         conda-forge/noarch::defusedxml-0.7.1-pyhd8ed1ab_0
  entrypoints        conda-forge/noarch::entrypoints-0.4-pyhd8ed1ab_0
  executing          conda-forge/noarch::executing-1.0.0-pyhd8ed1ab_0
  flit-core          conda-forge/noarch::flit-core-3.7.1-pyhd8ed1ab_0
  icu                conda-forge/linux-64::icu-70.1-h27087fc_0
  importlib-metadata conda-forge/linux-64::importlib-metadata-4.11.4-py38h578d9bd_0
  importlib_metadata conda-forge/noarch::importlib_metadata-4.11.4-hd8ed1ab_0
  importlib_resourc~ conda-forge/noarch::importlib_resources-5.9.0-pyhd8ed1ab_0
  ipykernel          conda-forge/noarch::ipykernel-6.15.3-pyh210e3f2_0
  ipython            conda-forge/noarch::ipython-8.5.0-pyh41d4057_1
  ipython_genutils   conda-forge/noarch::ipython_genutils-0.2.0-py_1
  jedi               conda-forge/noarch::jedi-0.18.1-pyhd8ed1ab_2
  json5              conda-forge/noarch::json5-0.9.5-pyh9f0ad1d_0
  jsonschema         conda-forge/noarch::jsonschema-4.16.0-pyhd8ed1ab_0
  jupyter_client     conda-forge/noarch::jupyter_client-7.3.5-pyhd8ed1ab_0
  jupyter_core       conda-forge/linux-64::jupyter_core-4.11.1-py38h578d9bd_0
  jupyter_server     conda-forge/noarch::jupyter_server-1.18.1-pyhd8ed1ab_0
  jupyterlab         conda-forge/noarch::jupyterlab-3.4.7-pyhd8ed1ab_0
  jupyterlab_pygmen~ conda-forge/noarch::jupyterlab_pygments-0.2.2-pyhd8ed1ab_0
  jupyterlab_server  conda-forge/noarch::jupyterlab_server-2.15.1-pyhd8ed1ab_0
  libiconv           conda-forge/linux-64::libiconv-1.16-h516909a_0
  libsodium          conda-forge/linux-64::libsodium-1.0.18-h36c2ea0_1
  libxml2            conda-forge/linux-64::libxml2-2.9.14-h22db469_4
  libxslt            conda-forge/linux-64::libxslt-1.1.35-h8affb1d_0
  lxml               conda-forge/linux-64::lxml-4.9.1-py38h0a891b7_0
  matplotlib-inline  conda-forge/noarch::matplotlib-inline-0.1.6-pyhd8ed1ab_0
  mistune            conda-forge/noarch::mistune-2.0.4-pyhd8ed1ab_0
  nbclassic          conda-forge/noarch::nbclassic-0.4.3-pyhd8ed1ab_0
  nbclient           conda-forge/noarch::nbclient-0.6.8-pyhd8ed1ab_0
  nbconvert          conda-forge/noarch::nbconvert-7.0.0-pyhd8ed1ab_0
  nbconvert-core     conda-forge/noarch::nbconvert-core-7.0.0-pyhd8ed1ab_0
  nbconvert-pandoc   conda-forge/noarch::nbconvert-pandoc-7.0.0-pyhd8ed1ab_0
  nbformat           conda-forge/noarch::nbformat-5.5.0-pyhd8ed1ab_0
  nest-asyncio       conda-forge/noarch::nest-asyncio-1.5.5-pyhd8ed1ab_0
  notebook           conda-forge/noarch::notebook-6.4.12-pyha770c72_0
  notebook-shim      conda-forge/noarch::notebook-shim-0.1.0-pyhd8ed1ab_0
  pandoc             conda-forge/linux-64::pandoc-2.19.2-ha770c72_0
  pandocfilters      conda-forge/noarch::pandocfilters-1.5.0-pyhd8ed1ab_0
  parso              conda-forge/noarch::parso-0.8.3-pyhd8ed1ab_0
  pexpect            conda-forge/noarch::pexpect-4.8.0-pyh9f0ad1d_2
  pickleshare        conda-forge/noarch::pickleshare-0.7.5-py_1003
  pkgutil-resolve-n~ conda-forge/noarch::pkgutil-resolve-name-1.3.10-pyhd8ed1ab_0
  prometheus_client  conda-forge/noarch::prometheus_client-0.14.1-pyhd8ed1ab_0
  prompt-toolkit     conda-forge/noarch::prompt-toolkit-3.0.31-pyha770c72_0
  psutil             conda-forge/linux-64::psutil-5.9.2-py38h0a891b7_0
  ptyprocess         conda-forge/noarch::ptyprocess-0.7.0-pyhd3deb0d_0
  pure_eval          conda-forge/noarch::pure_eval-0.2.2-pyhd8ed1ab_0
  pygments           conda-forge/noarch::pygments-2.13.0-pyhd8ed1ab_0
  pyrsistent         conda-forge/linux-64::pyrsistent-0.18.1-py38h0a891b7_1
  python-dateutil    conda-forge/noarch::python-dateutil-2.8.2-pyhd8ed1ab_0
  python-fastjsonsc~ conda-forge/noarch::python-fastjsonschema-2.16.1-pyhd8ed1ab_0
  pytz               conda-forge/noarch::pytz-2022.2.1-pyhd8ed1ab_0
  pyzmq              conda-forge/linux-64::pyzmq-24.0.0-py38hfc09fa9_0
  send2trash         conda-forge/noarch::send2trash-1.8.0-pyhd8ed1ab_0
  six                conda-forge/noarch::six-1.16.0-pyh6c4a22f_0
  sniffio            conda-forge/noarch::sniffio-1.3.0-pyhd8ed1ab_0
  soupsieve          conda-forge/noarch::soupsieve-2.3.2.post1-pyhd8ed1ab_0
  stack_data         conda-forge/noarch::stack_data-0.5.0-pyhd8ed1ab_0
  terminado          conda-forge/linux-64::terminado-0.15.0-py38h578d9bd_0
  tinycss2           conda-forge/noarch::tinycss2-1.1.1-pyhd8ed1ab_0
  tomli              conda-forge/noarch::tomli-2.0.1-pyhd8ed1ab_0
  tornado            conda-forge/linux-64::tornado-6.2-py38h0a891b7_0
  traitlets          conda-forge/noarch::traitlets-5.4.0-pyhd8ed1ab_0
  wcwidth            conda-forge/noarch::wcwidth-0.2.5-pyh9f0ad1d_2
  webencodings       conda-forge/noarch::webencodings-0.5.1-py_1
  websocket-client   conda-forge/noarch::websocket-client-1.4.1-pyhd8ed1ab_0
  zeromq             conda-forge/linux-64::zeromq-4.3.4-h9c3ff4c_1
  zipp               conda-forge/noarch::zipp-3.8.1-pyhd8ed1ab_0

Proceed ([y]/n)? y

Downloading and Extracting Packages
traitlets-5.4.0      | 85 KB   | ### | 100%
nbclient-0.6.8       | 65 KB   | ### | 100%
ipykernel-6.15.3     | 99 KB   | ### | 100%
tomli-2.0.1          | 16 KB   | ### | 100%
ipython-8.5.0        | 552 KB  | ### | 100%
jsonschema-4.16.0    | 65 KB   | ### | 100%
jupyter_core-4.11.1  | 81 KB   | ### | 100%
psutil-5.9.2         | 346 KB  | ### | 100%
lxml-4.9.1           | 1.4 MB  | ### | 100%
jupyterlab-3.4.7     | 5.9 MB  | ### | 100%
terminado-0.15.0     | 28 KB   | ### | 100%
nbformat-5.5.0       | 105 KB  | ### | 100%
argon2-cffi-bindings | 34 KB   | ### | 100%
debugpy-1.6.3        | 2.0 MB  | ### | 100%
pyrsistent-0.18.1    | 92 KB   | ### | 100%
importlib-metadata-4 | 33 KB   | ### | 100%
pyzmq-24.0.0         | 499 KB  | ### | 100%
tornado-6.2          | 653 KB  | ### | 100%
prompt-toolkit-3.0.3 | 254 KB  | ### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done

(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$
(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ python -m ipykernel install --user --name rasa_py38
Installed kernelspec rasa_py38 in /home/ashish/.local/share/jupyter/kernels/rasa_py38
(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ pip3 install rasa --user
Collecting rasa
  Downloading rasa-3.2.8-py3-none-any.whl (819 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
819.2/819.2 kB 320.6 kB/s eta 0:00:00 Collecting coloredlogs<16,>=10 Downloading coloredlogs-15.0.1-py2.py3-none-any.whl (46 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 46.0/46.0 kB 209.8 kB/s eta 0:00:00 Collecting pykwalify<1.9,>=1.7 Downloading pykwalify-1.8.0-py2.py3-none-any.whl (24 kB) Collecting python-socketio<6,>=4.4 Downloading python_socketio-5.7.1-py3-none-any.whl (56 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 56.6/56.6 kB 234.4 kB/s eta 0:00:00 Collecting dask==2022.2.0 Downloading dask-2022.2.0-py3-none-any.whl (1.1 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.1/1.1 MB 287.9 kB/s eta 0:00:00 Collecting questionary<1.11.0,>=1.5.1 Downloading questionary-1.10.0-py3-none-any.whl (31 kB) Collecting colorclass<2.3,>=2.2 Downloading colorclass-2.2.2-py2.py3-none-any.whl (18 kB) Collecting twilio<6.51,>=6.26 Downloading twilio-6.50.1.tar.gz (457 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 457.9/457.9 kB 272.9 kB/s eta 0:00:00 Preparing metadata (setup.py) ... done Collecting regex<2022.5,>=2020.6 Downloading regex-2022.4.24-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (764 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 764.9/764.9 kB 324.5 kB/s eta 0:00:00 Collecting slackclient<3.0.0,>=2.0.0 Downloading slackclient-2.9.4-py2.py3-none-any.whl (97 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 97.1/97.1 kB 196.5 kB/s eta 0:00:00 Collecting aio-pika<9.0.0,>=6.7.1 Downloading aio-pika-8.2.1.tar.gz (42 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 42.3/42.3 kB 218.7 kB/s eta 0:00:00 Preparing metadata (setup.py) ... 
done Collecting python-engineio!=5.0.0,<6,>=4 Downloading python_engineio-4.3.4-py3-none-any.whl (52 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 52.9/52.9 kB 281.0 kB/s eta 0:00:00 Collecting typing-extensions<4.0.0,>=3.7.4 Downloading typing_extensions-3.10.0.2-py3-none-any.whl (26 kB) Collecting apscheduler<3.8,>=3.6 Downloading APScheduler-3.7.0-py2.py3-none-any.whl (59 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 59.3/59.3 kB 246.7 kB/s eta 0:00:00 Requirement already satisfied: requests<3.0,>=2.23 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from rasa) (2.28.1) Collecting rasa-sdk<3.3.0,>=3.2.0 Downloading rasa_sdk-3.2.1-py3-none-any.whl (41 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 41.0/41.0 kB 290.6 kB/s eta 0:00:00 Collecting numpy<1.20.0,>=1.19.2 Downloading numpy-1.19.5-cp38-cp38-manylinux2010_x86_64.whl (14.9 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 14.9/14.9 MB 308.9 kB/s eta 0:00:00 Collecting pytz<2022.0,>=2019.1 Downloading pytz-2021.3-py2.py3-none-any.whl (503 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 503.5/503.5 kB 323.7 kB/s eta 0:00:00 Requirement already satisfied: setuptools>=41.0.0 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from rasa) (65.3.0) Collecting google-auth<2 Downloading google_auth-1.35.0-py2.py3-none-any.whl (152 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 152.9/152.9 kB 247.8 kB/s eta 0:00:00 Requirement already satisfied: tqdm<5.0,>=4.31 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from rasa) (4.64.1) Collecting CacheControl<0.13.0,>=0.12.9 Downloading CacheControl-0.12.11-py2.py3-none-any.whl (21 kB) Collecting scikit-learn<0.25,>=0.22 Downloading scikit_learn-0.24.2-cp38-cp38-manylinux2010_x86_64.whl (24.9 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 24.9/24.9 MB 288.4 kB/s eta 0:00:00 Collecting tensorflow-addons<0.16.0,>=0.15.0 Downloading tensorflow_addons-0.15.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB) 
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.1/1.1 MB 323.8 kB/s eta 0:00:00 Collecting joblib<1.1.0,>=0.15.1 Downloading joblib-1.0.1-py3-none-any.whl (303 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 303.1/303.1 kB 310.5 kB/s eta 0:00:00 Collecting attrs<21.3,>=19.3 Downloading attrs-21.2.0-py2.py3-none-any.whl (53 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 53.7/53.7 kB 292.9 kB/s eta 0:00:00 Collecting pymongo[srv,tls]<3.11,>=3.8 Downloading pymongo-3.10.1-cp38-cp38-manylinux2014_x86_64.whl (480 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 480.1/480.1 kB 314.2 kB/s eta 0:00:00 Collecting sentry-sdk<1.4.0,>=0.17.0 Downloading sentry_sdk-1.3.1-py2.py3-none-any.whl (133 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.6/133.6 kB 247.9 kB/s eta 0:00:00 Collecting networkx<2.7,>=2.4 Downloading networkx-2.6.3-py3-none-any.whl (1.9 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.9/1.9 MB 338.2 kB/s eta 0:00:00 Collecting tensorflow-text<2.8.0,>=2.7.0 Downloading tensorflow_text-2.7.3-cp38-cp38-manylinux2010_x86_64.whl (4.9 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.9/4.9 MB 343.2 kB/s eta 0:00:00 Collecting kafka-python<3.0,>=1.4 Downloading kafka_python-2.0.2-py2.py3-none-any.whl (246 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 246.5/246.5 kB 335.8 kB/s eta 0:00:00 Collecting absl-py<0.14,>=0.9 Downloading absl_py-0.13.0-py3-none-any.whl (132 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 132.1/132.1 kB 241.4 kB/s eta 0:00:00 Collecting scipy<1.8.0,>=1.4.1 Downloading scipy-1.7.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (39.3 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 39.3/39.3 MB 297.4 kB/s eta 0:00:00 Collecting terminaltables<3.2.0,>=3.1.0 Downloading terminaltables-3.1.10-py2.py3-none-any.whl (15 kB) Collecting SQLAlchemy<1.5.0,>=1.4.0 Downloading SQLAlchemy-1.4.41-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.6 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 294.6 kB/s eta 0:00:00 
Collecting packaging<21.0,>=20.0 Downloading packaging-20.9-py2.py3-none-any.whl (40 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 40.9/40.9 kB 157.0 kB/s eta 0:00:00 Collecting webexteamssdk<1.7.0,>=1.1.1 Downloading webexteamssdk-1.6.1-py3-none-any.whl (113 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 113.5/113.5 kB 259.2 kB/s eta 0:00:00 Requirement already satisfied: python-dateutil<2.9,>=2.8 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from rasa) (2.8.2) Collecting rocketchat_API<1.26.0,>=0.6.31 Downloading rocketchat_API-1.25.0-py3-none-any.whl (19 kB) Collecting fbmessenger<6.1.0,>=6.0.0 Downloading fbmessenger-6.0.0-py2.py3-none-any.whl (11 kB) Collecting ruamel.yaml<0.17.0,>=0.16.5 Downloading ruamel.yaml-0.16.13-py2.py3-none-any.whl (111 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 111.9/111.9 kB 339.1 kB/s eta 0:00:00 Collecting randomname<0.2.0,>=0.1.5 Downloading randomname-0.1.5.tar.gz (36 kB) Preparing metadata (setup.py) ... done Collecting colorhash<1.1.0,>=1.0.2 Downloading colorhash-1.0.4-py3-none-any.whl (5.5 kB) Collecting cloudpickle<1.7,>=1.2 Downloading cloudpickle-1.6.0-py3-none-any.whl (23 kB) Collecting matplotlib<3.4,>=3.1 Downloading matplotlib-3.3.4-cp38-cp38-manylinux1_x86_64.whl (11.6 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.6/11.6 MB 288.3 kB/s eta 0:00:00 Collecting prompt-toolkit<3.0.29,>=3.0 Downloading prompt_toolkit-3.0.28-py3-none-any.whl (380 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 380.2/380.2 kB 255.8 kB/s eta 0:00:00 Collecting pydot<1.5,>=1.4 Downloading pydot-1.4.2-py2.py3-none-any.whl (21 kB) Collecting boto3<2.0,>=1.12 Downloading boto3-1.24.75-py3-none-any.whl (132 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 132.5/132.5 kB 250.2 kB/s eta 0:00:00 Collecting redis<4.0,>=3.4 Downloading redis-3.5.3-py2.py3-none-any.whl (72 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 72.1/72.1 kB 267.3 kB/s eta 0:00:00 Collecting sanic<21.13,>=21.12 Downloading sanic-21.12.2-py3-none-any.whl (156 kB) 
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 156.5/156.5 kB 276.9 kB/s eta 0:00:00 Collecting aiohttp!=3.7.4.post0,<3.8,>=3.6 Downloading aiohttp-3.7.4-cp38-cp38-manylinux2014_x86_64.whl (1.5 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 331.6 kB/s eta 0:00:00 Collecting sanic-routing<0.8.0,>=0.7.2 Downloading sanic_routing-0.7.2-py3-none-any.whl (23 kB) Collecting ujson<6.0,>=1.35 Downloading ujson-5.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (46 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 46.1/46.1 kB 178.6 kB/s eta 0:00:00 Collecting PyJWT[crypto]<3.0.0,>=2.0.0 Downloading PyJWT-2.5.0-py3-none-any.whl (20 kB) Collecting sanic-cors<2.1.0,>=2.0.0 Downloading Sanic_Cors-2.0.1-py2.py3-none-any.whl (17 kB) Collecting tensorflow<2.8.0,>=2.7.0 Downloading tensorflow-2.7.4-cp38-cp38-manylinux2010_x86_64.whl (496.0 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 496.0/496.0 MB 206.4 kB/s eta 0:00:00 Collecting pyTelegramBotAPI<5.0.0,>=3.7.3 Downloading pyTelegramBotAPI-4.7.0.tar.gz (210 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 210.7/210.7 kB 229.1 kB/s eta 0:00:00 Preparing metadata (setup.py) ... done Collecting jsonpickle<2.1,>=1.3 Downloading jsonpickle-2.0.0-py2.py3-none-any.whl (37 kB) Collecting sanic-jwt<2.0.0,>=1.6.0 Downloading sanic_jwt-1.8.0-py3-none-any.whl (23 kB) Collecting typing-utils<0.2.0,>=0.1.0 Downloading typing_utils-0.1.0-py3-none-any.whl (10 kB) Collecting mattermostwrapper<2.3,>=2.2 Downloading mattermostwrapper-2.2.tar.gz (2.5 kB) Preparing metadata (setup.py) ... 
done Collecting tarsafe<0.0.4,>=0.0.3 Downloading tarsafe-0.0.3-py3-none-any.whl (5.0 kB) Collecting jsonschema<4.5,>=3.2 Downloading jsonschema-4.4.0-py3-none-any.whl (72 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 72.7/72.7 kB 240.9 kB/s eta 0:00:00 Collecting psycopg2-binary<2.10.0,>=2.8.2 Downloading psycopg2_binary-2.9.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.0 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.0/3.0 MB 259.8 kB/s eta 0:00:00 Collecting tensorflow_hub<0.13.0,>=0.12.0 Downloading tensorflow_hub-0.12.0-py2.py3-none-any.whl (108 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 108.8/108.8 kB 232.1 kB/s eta 0:00:00 Collecting sklearn-crfsuite<0.4,>=0.3 Downloading sklearn_crfsuite-0.3.6-py2.py3-none-any.whl (12 kB) Collecting partd>=0.3.10 Downloading partd-1.3.0-py3-none-any.whl (18 kB) Collecting pyyaml>=5.3.1 Downloading PyYAML-6.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (701 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 701.2/701.2 kB 304.6 kB/s eta 0:00:00 Collecting toolz>=0.8.2 Downloading toolz-0.12.0-py3-none-any.whl (55 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 55.8/55.8 kB 243.2 kB/s eta 0:00:00 Collecting fsspec>=0.6.0 Downloading fsspec-2022.8.2-py3-none-any.whl (140 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 140.8/140.8 kB 278.9 kB/s eta 0:00:00 Requirement already satisfied: six in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from absl-py<0.14,>=0.9->rasa) (1.16.0) Collecting aiormq~=6.4.0 Downloading aiormq-6.4.2-py3-none-any.whl (34 kB) Collecting yarl Downloading yarl-1.8.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (262 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 262.1/262.1 kB 299.1 kB/s eta 0:00:00 Collecting chardet<4.0,>=2.0 Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.4/133.4 kB 308.3 kB/s eta 0:00:00 Collecting async-timeout<4.0,>=3.0 Downloading 
async_timeout-3.0.1-py3-none-any.whl (8.2 kB) Collecting multidict<7.0,>=4.5 Downloading multidict-6.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (121 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 121.3/121.3 kB 345.9 kB/s eta 0:00:00 Collecting tzlocal~=2.0 Downloading tzlocal-2.1-py2.py3-none-any.whl (16 kB) Collecting jmespath<2.0.0,>=0.7.1 Downloading jmespath-1.0.1-py3-none-any.whl (20 kB) Collecting botocore<1.28.0,>=1.27.75 Downloading botocore-1.27.75-py3-none-any.whl (9.1 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.1/9.1 MB 278.2 kB/s eta 0:00:00 Collecting s3transfer<0.7.0,>=0.6.0 Downloading s3transfer-0.6.0-py3-none-any.whl (79 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 79.6/79.6 kB 276.0 kB/s eta 0:00:00 Collecting msgpack>=0.5.2 Downloading msgpack-1.0.4-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (322 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 322.5/322.5 kB 266.5 kB/s eta 0:00:00 Collecting humanfriendly>=9.1 Downloading humanfriendly-10.0-py2.py3-none-any.whl (86 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 86.8/86.8 kB 254.5 kB/s eta 0:00:00 Collecting cachetools<5.0,>=2.0.0 Downloading cachetools-4.2.4-py3-none-any.whl (10 kB) Collecting pyasn1-modules>=0.2.1 Downloading pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 155.3/155.3 kB 261.2 kB/s eta 0:00:00 Collecting rsa<5,>=3.1.4 Downloading rsa-4.9-py3-none-any.whl (34 kB) Requirement already satisfied: pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from jsonschema<4.5,>=3.2->rasa) (0.18.1) Requirement already satisfied: importlib-resources>=1.4.0 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from jsonschema<4.5,>=3.2->rasa) (5.9.0) Collecting pillow>=6.2.0 Downloading Pillow-9.2.0-cp38-cp38-manylinux_2_28_x86_64.whl (3.2 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.2/3.2 MB 283.6 kB/s eta 0:00:00 Collecting 
kiwisolver>=1.0.1 Downloading kiwisolver-1.4.4-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.2 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 247.8 kB/s eta 0:00:00 Collecting cycler>=0.10 Downloading cycler-0.11.0-py3-none-any.whl (6.4 kB) Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.3 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from matplotlib<3.4,>=3.1->rasa) (3.0.9) Requirement already satisfied: wcwidth in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from prompt-toolkit<3.0.29,>=3.0->rasa) (0.2.5) Collecting types-cryptography>=3.3.21 Downloading types_cryptography-3.3.23-py3-none-any.whl (30 kB) Requirement already satisfied: cryptography>=3.3.1 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from PyJWT[crypto]<3.0.0,>=2.0.0->rasa) (37.0.4) Collecting docopt>=0.6.2 Downloading docopt-0.6.2.tar.gz (25 kB) Preparing metadata (setup.py) ... done Collecting dnspython<2.0.0,>=1.16.0 Downloading dnspython-1.16.0-py2.py3-none-any.whl (188 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 188.4/188.4 kB 204.2 kB/s eta 0:00:00 Collecting bidict>=0.21.0 Downloading bidict-0.22.0-py3-none-any.whl (36 kB) Collecting fire Downloading fire-0.4.0.tar.gz (87 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 87.7/87.7 kB 207.5 kB/s eta 0:00:00 Preparing metadata (setup.py) ... 
done Requirement already satisfied: urllib3<2.0.0,>=1.26.5 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from rasa-sdk<3.3.0,>=3.2.0->rasa) (1.26.11) Requirement already satisfied: charset-normalizer<3,>=2 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from requests<3.0,>=2.23->rasa) (2.1.1) Requirement already satisfied: certifi>=2017.4.17 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from requests<3.0,>=2.23->rasa) (2022.9.14) Requirement already satisfied: idna<4,>=2.5 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from requests<3.0,>=2.23->rasa) (3.3) Collecting ruamel.yaml.clib>=0.1.2 Downloading ruamel.yaml.clib-0.2.6-cp38-cp38-manylinux1_x86_64.whl (570 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 570.4/570.4 kB 335.9 kB/s eta 0:00:00 Collecting aiofiles>=0.6.0 Downloading aiofiles-22.1.0-py3-none-any.whl (14 kB) Collecting uvloop>=0.5.3 Downloading uvloop-0.17.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.6 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.6/4.6 MB 302.9 kB/s eta 0:00:00 Collecting httptools>=0.0.10 Downloading httptools-0.5.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (427 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 427.8/427.8 kB 290.8 kB/s eta 0:00:00 Collecting websockets>=10.0 Downloading websockets-10.3-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (111 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 111.3/111.3 kB 334.3 kB/s eta 0:00:00 Collecting multidict<7.0,>=4.5 Downloading multidict-5.2.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (187 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 187.4/187.4 kB 326.6 kB/s eta 0:00:00 Collecting threadpoolctl>=2.0.0 Downloading threadpoolctl-3.1.0-py3-none-any.whl (14 kB) Collecting python-crfsuite>=0.8.3 Downloading 
python_crfsuite-0.9.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.0 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.0/1.0 MB 326.0 kB/s eta 0:00:00 Collecting tabulate Downloading tabulate-0.8.10-py3-none-any.whl (29 kB) Collecting greenlet!=0.4.17 Downloading greenlet-1.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (157 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 157.1/157.1 kB 356.2 kB/s eta 0:00:00 Collecting opt-einsum>=2.3.2 Downloading opt_einsum-3.3.0-py3-none-any.whl (65 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 65.5/65.5 kB 318.3 kB/s eta 0:00:00 Collecting gast<0.5.0,>=0.2.1 Downloading gast-0.4.0-py3-none-any.whl (9.8 kB) Collecting google-pasta>=0.1.1 Downloading google_pasta-0.2.0-py3-none-any.whl (57 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.5/57.5 kB 221.1 kB/s eta 0:00:00 Collecting keras-preprocessing>=1.1.1 Downloading Keras_Preprocessing-1.1.2-py2.py3-none-any.whl (42 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 42.6/42.6 kB 278.5 kB/s eta 0:00:00 Collecting astunparse>=1.6.0 Downloading astunparse-1.6.3-py2.py3-none-any.whl (12 kB) Collecting flatbuffers<3.0,>=1.12 Downloading flatbuffers-2.0.7-py2.py3-none-any.whl (26 kB) Collecting libclang>=9.0.1 Downloading libclang-14.0.6-py2.py3-none-manylinux2010_x86_64.whl (14.1 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 14.1/14.1 MB 276.7 kB/s eta 0:00:00 Collecting wrapt>=1.11.0 Downloading wrapt-1.14.1-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (81 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 81.0/81.0 kB 400.8 kB/s eta 0:00:00 Collecting tensorboard~=2.6 Downloading tensorboard-2.10.0-py3-none-any.whl (5.9 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.9/5.9 MB 339.5 kB/s eta 0:00:00 Collecting tensorflow-estimator<2.8,~=2.7.0rc0 Downloading tensorflow_estimator-2.7.0-py2.py3-none-any.whl (463 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 463.1/463.1 kB 202.3 kB/s eta 0:00:00 Collecting termcolor>=1.1.0 
  Downloading termcolor-2.0.1-py3-none-any.whl (5.4 kB)
Requirement already satisfied: wheel<1.0,>=0.32.0 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from tensorflow<2.8.0,>=2.7.0->rasa) (0.37.1)
Collecting protobuf<3.20,>=3.9.2
  Downloading protobuf-3.19.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.1 MB)
Collecting keras<2.8,>=2.7.0rc0
  Downloading keras-2.7.0-py2.py3-none-any.whl (1.3 MB)
Collecting grpcio<2.0,>=1.24.3
  Downloading grpcio-1.49.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.7 MB)
Collecting tensorflow-io-gcs-filesystem>=0.21.0
  Downloading tensorflow_io_gcs_filesystem-0.27.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.4 MB)
Collecting h5py>=2.9.0
  Downloading h5py-3.7.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (4.5 MB)
Collecting typeguard>=2.7
  Downloading typeguard-2.13.3-py3-none-any.whl (17 kB)
Collecting requests-toolbelt
  Downloading requests_toolbelt-0.9.1-py2.py3-none-any.whl (54 kB)
Collecting future
  Downloading future-0.18.2.tar.gz (829 kB)
  Preparing metadata (setup.py) ... done
Collecting pamqp==3.2.1
  Downloading pamqp-3.2.1-py2.py3-none-any.whl (33 kB)
Requirement already satisfied: cffi>=1.12 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from cryptography>=3.3.1->PyJWT[crypto]<3.0.0,>=2.0.0->rasa) (1.15.1)
Requirement already satisfied: zipp>=3.1.0 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from importlib-resources>=1.4.0->jsonschema<4.5,>=3.2->rasa) (3.8.1)
Collecting locket
  Downloading locket-1.0.0-py2.py3-none-any.whl (4.4 kB)
Collecting pyasn1<0.5.0,>=0.4.6
  Downloading pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting tensorboard-data-server<0.7.0,>=0.6.0
  Downloading tensorboard_data_server-0.6.1-py3-none-manylinux2010_x86_64.whl (4.9 MB)
Collecting markdown>=2.6.8
  Downloading Markdown-3.4.1-py3-none-any.whl (93 kB)
Collecting werkzeug>=1.0.1
  Downloading Werkzeug-2.2.2-py3-none-any.whl (232 kB)
Collecting google-auth-oauthlib<0.5,>=0.4.1
  Downloading google_auth_oauthlib-0.4.6-py2.py3-none-any.whl (18 kB)
Collecting tensorboard-plugin-wit>=1.6.0
  Downloading tensorboard_plugin_wit-1.8.1-py3-none-any.whl (781 kB)
Requirement already satisfied: pycparser in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from cffi>=1.12->cryptography>=3.3.1->PyJWT[crypto]<3.0.0,>=2.0.0->rasa) (2.21)
Collecting requests-oauthlib>=0.7.0
  Downloading requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Requirement already satisfied: importlib-metadata>=4.4 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from markdown>=2.6.8->tensorboard~=2.6->tensorflow<2.8.0,>=2.7.0->rasa) (4.11.4)
Requirement already satisfied: MarkupSafe>=2.1.1 in /home/ashish/anaconda3/envs/rasa_py38/lib/python3.8/site-packages (from werkzeug>=1.0.1->tensorboard~=2.6->tensorflow<2.8.0,>=2.7.0->rasa) (2.1.1)
Collecting oauthlib>=3.0.0
  Downloading oauthlib-3.2.1-py3-none-any.whl (151 kB)
Building wheels for collected packages: aio-pika, mattermostwrapper, pyTelegramBotAPI, randomname, twilio, docopt, fire, future
  Building wheel for aio-pika (setup.py) ... done
  Created wheel for aio-pika: filename=aio_pika-8.2.1-py3-none-any.whl size=49595 sha256=61819ae86e93deb5d8f1f7c60f769d40e42ccf6de8af62f4af0a3bcb30a98499
  Stored in directory: /home/ashish/.cache/pip/wheels/27/ab/59/a3f84ec019a2a87ad812a2dc15945f763e5684aecfb059d9ec
  Building wheel for mattermostwrapper (setup.py) ... done
  Created wheel for mattermostwrapper: filename=mattermostwrapper-2.2-py3-none-any.whl size=2448 sha256=671230d2ec1c13e72ee8b2f53cd09e8a800d5db87dc166a6c2eb6ac019a89a7c
  Stored in directory: /home/ashish/.cache/pip/wheels/1a/d3/89/63aef88b581e7acc2c48812e6160a2bae57b6ef180f6e1f293
  Building wheel for pyTelegramBotAPI (setup.py) ... done
  Created wheel for pyTelegramBotAPI: filename=pyTelegramBotAPI-4.7.0-py3-none-any.whl size=192810 sha256=388f48b689f2b9f8f5ab4253a9a1a0c0caba6a9869dacffe2f60077cab1a69ea
  Stored in directory: /home/ashish/.cache/pip/wheels/f7/29/1d/113c046ac93c2896159bb2c33673efe140b3642f21a06c4ac5
  Building wheel for randomname (setup.py) ... done
  Created wheel for randomname: filename=randomname-0.1.5-py3-none-any.whl size=58808 sha256=b4ef46a33c316f83d0947ae923af542f7963176dffcd821789ce9af5b5d6393e
  Stored in directory: /home/ashish/.cache/pip/wheels/46/0a/9d/32f2d10d4fae6ddfbdeb504d949ccd70d666382a277e7c1bd5
  Building wheel for twilio (setup.py) ... done
  Created wheel for twilio: filename=twilio-6.50.1-py2.py3-none-any.whl size=1208682 sha256=bc11ed67ebb0cd32e27e90ea4db4232206f01c9c0ed5909773b04b4b9e20391d
  Stored in directory: /home/ashish/.cache/pip/wheels/85/db/cc/c5c9ef20439073ba2a5403f5ce292446d3a01547007927df87
  Building wheel for docopt (setup.py) ... done
  Created wheel for docopt: filename=docopt-0.6.2-py2.py3-none-any.whl size=13706 sha256=659b1afa51fb6f8497374a399b9e1c36a423a1fead4fa7999ccd1440917d65c2
  Stored in directory: /home/ashish/.cache/pip/wheels/56/ea/58/ead137b087d9e326852a851351d1debf4ada529b6ac0ec4e8c
  Building wheel for fire (setup.py) ... done
  Created wheel for fire: filename=fire-0.4.0-py2.py3-none-any.whl size=115926 sha256=ecb49e817db0442d3d91f76c263f923d22f9c308f2fb6b2f170a80b3ceef5ac0
  Stored in directory: /home/ashish/.cache/pip/wheels/1f/10/06/2a990ee4d73a8479fe2922445e8a876d38cfbfed052284c6a1
  Building wheel for future (setup.py) ... done
  Created wheel for future: filename=future-0.18.2-py3-none-any.whl size=491058 sha256=cca1f0c0a3dfd4ada8c1534891271c9122c30f91097627fc9495f64eed6ae3d1
  Stored in directory: /home/ashish/.cache/pip/wheels/8e/70/28/3d6ccd6e315f65f245da085482a2e1c7d14b90b30f239e2cf4
Successfully built aio-pika mattermostwrapper pyTelegramBotAPI randomname twilio docopt fire future
Installing collected packages: typing-extensions, types-cryptography, tensorflow-estimator, tensorboard-plugin-wit, sanic-routing, pytz, python-crfsuite, pyasn1, msgpack, libclang, keras, kafka-python, flatbuffers, docopt, chardet, wrapt, werkzeug, websockets, uvloop, ujson, tzlocal, typing-utils, typeguard, toolz, threadpoolctl, terminaltables, termcolor, tensorflow-io-gcs-filesystem, tensorboard-data-server, tarsafe, tabulate, sentry-sdk, ruamel.yaml.clib, rsa, regex, redis, pyyaml, python-engineio, pymongo, PyJWT, pydot, pyasn1-modules, psycopg2-binary, protobuf, prompt-toolkit, pillow, pamqp, packaging, oauthlib, numpy, networkx, multidict, locket, kiwisolver, jsonpickle, joblib, jmespath, humanfriendly, httptools, grpcio, greenlet, google-pasta, gast, future, fsspec, dnspython, cycler, colorhash, colorclass, cloudpickle, cachetools, bidict, attrs, async-timeout, astunparse, aiofiles, absl-py, yarl, twilio, tensorflow_hub, tensorflow-addons, SQLAlchemy, sklearn-crfsuite, scipy, sanic-jwt, sanic, ruamel.yaml, rocketchat_API, requests-toolbelt, requests-oauthlib, questionary, python-socketio, pyTelegramBotAPI, partd, opt-einsum, mattermostwrapper, matplotlib, markdown, keras-preprocessing, jsonschema, h5py, google-auth, fire, fbmessenger, coloredlogs, CacheControl, botocore, apscheduler, webexteamssdk, scikit-learn, sanic-cors, s3transfer, randomname, pykwalify, google-auth-oauthlib, dask, aiormq, aiohttp, tensorboard, slackclient, rasa-sdk, boto3, aio-pika, tensorflow, tensorflow-text, rasa
WARNING: The script chardetect is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script tabulate is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The scripts pyrsa-decrypt, pyrsa-encrypt, pyrsa-keygen, pyrsa-priv2pub, pyrsa-sign and pyrsa-verify are installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The scripts f2py, f2py3 and f2py3.8 are installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script humanfriendly is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The scripts futurize and pasteurize are installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The scripts make_image_classifier and make_nearest_neighbour_index are installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script sanic is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script markdown_py is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script jsonschema is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script coloredlogs is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script doesitcache is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script randomname is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script pykwalify is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script google-oauthlib-tool is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script tensorboard is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The scripts estimator_ckpt_converter, import_pb_to_tensorboard, saved_model_cli, tensorboard, tf_upgrade_v2, tflite_convert, toco and toco_from_protos are installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script rasa is installed in '/home/ashish/.local/bin' which is not on PATH. Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed CacheControl-0.12.11 PyJWT-2.5.0 SQLAlchemy-1.4.41 absl-py-0.13.0 aio-pika-8.2.1 aiofiles-22.1.0 aiohttp-3.7.4 aiormq-6.4.2 apscheduler-3.7.0 astunparse-1.6.3 async-timeout-3.0.1 attrs-21.2.0 bidict-0.22.0 boto3-1.24.75 botocore-1.27.75 cachetools-4.2.4 chardet-3.0.4 cloudpickle-1.6.0 colorclass-2.2.2 coloredlogs-15.0.1 colorhash-1.0.4 cycler-0.11.0 dask-2022.2.0 dnspython-1.16.0 docopt-0.6.2 fbmessenger-6.0.0 fire-0.4.0 flatbuffers-2.0.7 fsspec-2022.8.2 future-0.18.2 gast-0.4.0 google-auth-1.35.0 google-auth-oauthlib-0.4.6 google-pasta-0.2.0 greenlet-1.1.3 grpcio-1.49.0 h5py-3.7.0 httptools-0.5.0 humanfriendly-10.0 jmespath-1.0.1 joblib-1.0.1 jsonpickle-2.0.0 jsonschema-4.4.0 kafka-python-2.0.2 keras-2.7.0 keras-preprocessing-1.1.2 kiwisolver-1.4.4 libclang-14.0.6 locket-1.0.0 markdown-3.4.1 matplotlib-3.3.4 mattermostwrapper-2.2 msgpack-1.0.4 multidict-5.2.0 networkx-2.6.3 numpy-1.19.5 oauthlib-3.2.1 opt-einsum-3.3.0 packaging-20.9 pamqp-3.2.1 partd-1.3.0 pillow-9.2.0 prompt-toolkit-3.0.28 protobuf-3.19.5 psycopg2-binary-2.9.3 pyTelegramBotAPI-4.7.0 pyasn1-0.4.8 pyasn1-modules-0.2.8 pydot-1.4.2 pykwalify-1.8.0 pymongo-3.10.1 python-crfsuite-0.9.8 python-engineio-4.3.4 python-socketio-5.7.1 pytz-2021.3 pyyaml-6.0 questionary-1.10.0 randomname-0.1.5 rasa-3.2.8 rasa-sdk-3.2.1 redis-3.5.3 regex-2022.4.24 requests-oauthlib-1.3.1 requests-toolbelt-0.9.1 rocketchat_API-1.25.0 rsa-4.9 ruamel.yaml-0.16.13 ruamel.yaml.clib-0.2.6 s3transfer-0.6.0 sanic-21.12.2 sanic-cors-2.0.1 sanic-jwt-1.8.0 sanic-routing-0.7.2 scikit-learn-0.24.2 scipy-1.7.3 sentry-sdk-1.3.1 sklearn-crfsuite-0.3.6 slackclient-2.9.4 tabulate-0.8.10 tarsafe-0.0.3 tensorboard-2.10.0 tensorboard-data-server-0.6.1 tensorboard-plugin-wit-1.8.1 tensorflow-2.7.4 tensorflow-addons-0.15.0 tensorflow-estimator-2.7.0 tensorflow-io-gcs-filesystem-0.27.0 tensorflow-text-2.7.3 tensorflow_hub-0.12.0 termcolor-2.0.1 terminaltables-3.1.10 threadpoolctl-3.1.0 toolz-0.12.0 twilio-6.50.1 typeguard-2.13.3 types-cryptography-3.3.23 typing-extensions-3.10.0.2 typing-utils-0.1.0 tzlocal-2.1 ujson-5.5.0 uvloop-0.17.0 webexteamssdk-1.6.1 websockets-10.3 werkzeug-2.2.2 wrapt-1.14.1 yarl-1.8.1

(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ pip show rasa
Name: rasa
Version: 3.2.8
Summary: Open source machine learning framework to automate text- and voice-based conversations: NLU, dialogue management, connect to Slack, Facebook, and more - Create chatbots and voice assistants
Home-page: https://rasa.com
Author: Rasa Technologies GmbH
Author-email: hi@rasa.com
License: Apache-2.0
Location: /home/ashish/.local/lib/python3.8/site-packages
Requires: absl-py, aio-pika, aiohttp, apscheduler, attrs, boto3, CacheControl, cloudpickle, colorclass, coloredlogs, colorhash, dask, fbmessenger, google-auth, joblib, jsonpickle, jsonschema, kafka-python, matplotlib, mattermostwrapper, networkx, numpy, packaging, prompt-toolkit, psycopg2-binary, pydot, PyJWT, pykwalify, pymongo, pyTelegramBotAPI, python-dateutil, python-engineio, python-socketio, pytz, questionary, randomname, rasa-sdk, redis, regex, requests, rocketchat_API, ruamel.yaml, sanic, sanic-cors, sanic-jwt, sanic-routing, scikit-learn, scipy, sentry-sdk, setuptools, sklearn-crfsuite, slackclient, SQLAlchemy, tarsafe, tensorflow, tensorflow-addons, tensorflow-text, tensorflow_hub, terminaltables, tqdm, twilio, typing-extensions, typing-utils, ujson, webexteamssdk
Required-by:
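The warnings above mean pip placed the console scripts (including `rasa`) in ~/.local/bin, which the shell cannot find unless that directory is on PATH. A quick way to confirm the diagnosis before touching any config files (a minimal sketch, assuming a POSIX shell):

```shell
# Diagnose "command not found" for a pip-installed console script.
# 'rasa' is the script from this transcript; the same check works for
# any script pip put under ~/.local/bin.
if command -v rasa >/dev/null 2>&1; then
    echo "rasa resolves to: $(command -v rasa)"
else
    echo "rasa is not on PATH"
fi

# Confirm the script file itself exists where pip said it put it:
if [ -x "$HOME/.local/bin/rasa" ]; then
    echo "script is present in ~/.local/bin"
else
    echo "script not found in ~/.local/bin"
fi
```

If the script exists but `command -v` fails, the problem is purely the PATH variable, which is exactly what the next section fixes.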

Fix the PATH variable to enable the Rasa CLI

(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ rasa
rasa: command not found
(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ sudo nano ~/.bashrc
[sudo] password for ashish:
(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ source ~/.bashrc
(base) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ conda activate rasa_py38
(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ rasa
usage: rasa [-h] [--version]
            {init,run,shell,train,interactive,telemetry,test,visualize,data,export,x,evaluate} ...

Rasa command line interface. Rasa allows you to build your own conversational assistants 🤖. The 'rasa' command allows you to easily run most common commands like creating a new bot, training or evaluating models.

positional arguments:
  {init,run,shell,train,interactive,telemetry,test,visualize,data,export,x,evaluate}
                        Rasa commands
    init                Creates a new project, with example training data, actions, and config files.
    run                 Starts a Rasa server with your trained model.
    shell               Loads your trained model and lets you talk to your assistant on the command line.
    train               Trains a Rasa model using your NLU data and stories.
    interactive         Starts an interactive learning session to create new training data for a Rasa model by chatting.
    telemetry           Configuration of Rasa Open Source telemetry reporting.
    test                Tests Rasa models using your test NLU data and stories.
    visualize           Visualize stories.
    data                Utils for the Rasa training files.
    export              Export conversations using an event broker.
    evaluate            Tools for evaluating models.

optional arguments:
  -h, --help            show this help message and exit
  --version             Print installed Rasa version
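The transcript opens ~/.bashrc in nano but does not show the exact line that was added. It was presumably something along these lines (an assumption, not shown in the session; also note that sudo is unnecessary here, since ~/.bashrc is owned by the user):

```shell
# Assumed content of the edit made in nano: prepend pip's user-script
# directory so console scripts such as 'rasa' are found by the shell.
export PATH="$HOME/.local/bin:$PATH"
```

After adding the line, `source ~/.bashrc` reloads the file in the current shell, which is why `rasa` starts resolving immediately without opening a new terminal.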

The Fixed PATH Variable

(rasa_py38) ashish@ashish-Lenovo-ideapad-130-15IKB:~/Desktop$ echo $PATH
/home/ashish/.local/bin:/home/ashish/anaconda3/envs/rasa_py38/bin:/home/ashish/anaconda3/condabin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin
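The PATH printed above now starts with /home/ashish/.local/bin, so the fix took effect. For a check that doesn't rely on eyeballing a long colon-separated string, a small helper like the following can be used (an illustrative sketch; `path_contains` and `SHOWN_PATH` are names invented here, with SHOWN_PATH holding the value from the transcript):

```shell
# Check whether a colon-separated PATH-style value contains a directory.
path_contains() {
    case ":$2:" in
        *":$1:"*) return 0 ;;   # directory present
        *)        return 1 ;;   # directory absent
    esac
}

# The PATH value printed in the transcript above:
SHOWN_PATH="/home/ashish/.local/bin:/home/ashish/anaconda3/envs/rasa_py38/bin:/home/ashish/anaconda3/condabin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin"

path_contains "/home/ashish/.local/bin" "$SHOWN_PATH" && echo "user script dir is on PATH"
```

Wrapping the value in colons before matching avoids false positives on partial directory names (e.g. /home/ashish/.local/bin2).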
Tags: Technology,Python,Anaconda,Rasa,Natural Language Processing