Monday, March 7, 2022

Running Weka Apriori on 9_TXN_5_ITEMS Dataset

A CSV file that did not follow Weka's convention of question marks for missing values failed to load.

Error Message:
Full Screen:

The file that is erroneous for Weka opens without any issues in LibreOffice:

So, we create our custom file in a manner similar to Weka's Supermarket dataset:
tid,item1,item2,item3,item4,item5
T100,t,t,?,?,t
T200,?,t,?,t,?
T300,?,t,t,?,?
T400,t,t,?,t,?
T500,t,?,t,?,?
T600,?,t,t,?,?
T700,t,?,t,?,?
T800,t,t,t,?,t
T900,t,t,t,?,?
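For reference, generating such a file from a plain transaction list takes only a few lines of Python. This is a minimal sketch (variable names are ours) that reproduces the CSV above:

```python
# Build a Weka-friendly CSV ('t' for present, '?' for missing)
# from a plain list of transactions; layout follows the table above.
transactions = {
    "T100": {1, 2, 5}, "T200": {2, 4},       "T300": {2, 3},
    "T400": {1, 2, 4}, "T500": {1, 3},       "T600": {2, 3},
    "T700": {1, 3},    "T800": {1, 2, 3, 5}, "T900": {1, 2, 3},
}
items = [1, 2, 3, 4, 5]

lines = ["tid," + ",".join(f"item{i}" for i in items)]
for tid, basket in transactions.items():
    row = [tid] + [("t" if i in basket else "?") for i in items]
    lines.append(",".join(row))

csv_text = "\n".join(lines)
print(csv_text)
```

Writing `csv_text` to a file should give input that Weka's CSV loader accepts.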

Weka's Apriori Run Information For Small Dataset As Above With TID

=== Run information ===

Scheme:       weka.associations.Apriori -N 10 -T 0 -C 0.9 -D 0.05 -U 1.0 -M 0.1 -S -1.0 -c -1
Relation:     9_txn_5_items
Instances:    9
Attributes:   6
                tid
                item1
                item2
                item3
                item4
                item5
=== Associator model (full training set) ===


Apriori
=======

Minimum support: 0.16 (1 instances)
Minimum metric <confidence>: 0.9
Number of cycles performed: 17

Generated sets of large itemsets:

Size of set of large itemsets L(1): 14

Size of set of large itemsets L(2): 31

Size of set of large itemsets L(3): 25

Size of set of large itemsets L(4): 8

Size of set of large itemsets L(5): 1

Best rules found:

1. item5=t 2 ==> item1=t 2    <conf:(1)> lift:(1.5) lev:(0.07) [0] conv:(0.67)
2. item4=t 2 ==> item2=t 2    <conf:(1)> lift:(1.29) lev:(0.05) [0] conv:(0.44)
3. item5=t 2 ==> item2=t 2    <conf:(1)> lift:(1.29) lev:(0.05) [0] conv:(0.44)
4. item2=t item5=t 2 ==> item1=t 2    <conf:(1)> lift:(1.5) lev:(0.07) [0] conv:(0.67)
5. item1=t item5=t 2 ==> item2=t 2    <conf:(1)> lift:(1.29) lev:(0.05) [0] conv:(0.44)
6. item5=t 2 ==> item1=t item2=t 2    <conf:(1)> lift:(2.25) lev:(0.12) [1] conv:(1.11)
7. tid=T100 1 ==> item1=t 1    <conf:(1)> lift:(1.5) lev:(0.04) [0] conv:(0.33)
8. tid=T100 1 ==> item2=t 1    <conf:(1)> lift:(1.29) lev:(0.02) [0] conv:(0.22)
9. tid=T100 1 ==> item5=t 1    <conf:(1)> lift:(4.5) lev:(0.09) [0] conv:(0.78)
10. tid=T200 1 ==> item2=t 1    <conf:(1)> lift:(1.29) lev:(0.02) [0] conv:(0.22) 

We run Apriori again, this time without the TID column (rules 7 to 10 above show why: Apriori treated the unique tid values as items and generated meaningless rules from them):

Logs from Weka:

=== Run information ===

Scheme:       weka.associations.Apriori -N 10 -T 0 -C 0.9 -D 0.05 -U 1.0 -M 0.1 -S -1.0 -c -1
Relation:     9_txn_5_items_without_tid
Instances:    9
Attributes:   5
                item1
                item2
                item3
                item4
                item5
=== Associator model (full training set) ===


Apriori
=======

Minimum support: 0.16 (1 instances)
Minimum metric <confidence>: 0.9
Number of cycles performed: 17

Generated sets of large itemsets:

Size of set of large itemsets L(1): 5

Size of set of large itemsets L(2): 8

Size of set of large itemsets L(3): 5

Size of set of large itemsets L(4): 1

Best rules found:

1. item5=t 2 ==> item1=t 2    <conf:(1)> lift:(1.5) lev:(0.07) [0] conv:(0.67)
2. item4=t 2 ==> item2=t 2    <conf:(1)> lift:(1.29) lev:(0.05) [0] conv:(0.44)
3. item5=t 2 ==> item2=t 2    <conf:(1)> lift:(1.29) lev:(0.05) [0] conv:(0.44)
4. item2=t item5=t 2 ==> item1=t 2    <conf:(1)> lift:(1.5) lev:(0.07) [0] conv:(0.67)
5. item1=t item5=t 2 ==> item2=t 2    <conf:(1)> lift:(1.29) lev:(0.05) [0] conv:(0.44)
6. item5=t 2 ==> item1=t item2=t 2    <conf:(1)> lift:(2.25) lev:(0.12) [1] conv:(1.11)
7. item1=t item4=t 1 ==> item2=t 1    <conf:(1)> lift:(1.29) lev:(0.02) [0] conv:(0.22)
8. item3=t item5=t 1 ==> item1=t 1    <conf:(1)> lift:(1.5) lev:(0.04) [0] conv:(0.33)
9. item3=t item5=t 1 ==> item2=t 1    <conf:(1)> lift:(1.29) lev:(0.02) [0] conv:(0.22)
10. item2=t item3=t item5=t 1 ==> item1=t 1    <conf:(1)> lift:(1.5) lev:(0.04) [0] conv:(0.33)
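The itemset sizes Weka reports can be cross-checked with a brute-force sketch in plain Python (not Weka's implementation; the minimum support count of 1 instance is taken from the log above):

```python
from itertools import combinations

# The nine transactions from the dataset above, without the tid column.
transactions = [
    {1, 2, 5}, {2, 4}, {2, 3}, {1, 2, 4}, {1, 3},
    {2, 3}, {1, 3}, {1, 2, 3, 5}, {1, 2, 3},
]
items = sorted(set().union(*transactions))
min_count = 1  # "Minimum support: 0.16 (1 instances)" in the log

sizes = {}
for k in range(1, len(items) + 1):
    # A k-itemset is "large" if it is contained in >= min_count transactions.
    large = [c for c in combinations(items, k)
             if sum(set(c) <= t for t in transactions) >= min_count]
    if not large:
        break
    sizes[k] = len(large)

print(sizes)  # {1: 5, 2: 8, 3: 5, 4: 1}, matching L(1) through L(4) above
```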
Tags: Technology,Machine Learning,Weka

Apriori Algorithm For Association Mining Using Weka's Supermarket Dataset

We will see that the 'supermarket.arff' dataset from the Weka repository uses the fixed-columns, true/false format:
@relation supermarket
@attribute 'department1' { t}
@attribute 'department2' { t}
@attribute 'department3' { t}
@attribute 'department4' { t}
@attribute 'department5' { t}
@attribute 'department6' { t}
@attribute 'department7' { t}
@attribute 'department8' { t}
@attribute 'department9' { t}
@attribute 'grocery misc' { t}
@attribute 'department11' { t}
@attribute 'baby needs' { t}
@attribute 'bread and cake' { t}
@attribute 'baking needs' { t}
@attribute 'coupons' { t}
@attribute 'juice-sat-cord-ms' { t}
@attribute 'tea' { t}
@attribute 'biscuits' { t}
@attribute 'canned fish-meat' { t}
...
... 
... 
@attribute 'department215' { t}
@attribute 'department216' { t}
@attribute 'total' { low, high} % low < 100
@data
?,?,?,?,?,?,?,?,?,?,?,t,t,t,?,t,?,t,?,?,t,?,?,?,t,t,t,t,?,t,?,t,t,?,?,?,?,?,?,t,t,t,?,?,?,?,?,?,?,t,?,?,?,?,?,?,?,?,t,?,t,?,?,t,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,high
t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,t,t,?,?,?,?,?,t,?,?,?,t,t,?,?,?,?,?,t,t,?,t,?,?,?,?,?,?,?,t,?,?,t,?,?,?,?,?,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,low
?,?,?,?,?,?,?,?,?,?,?,?,t,t,?,t,?,t,?,t,?,?,?,?,?,?,t,?,t,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,t,?,?,?,t,?,t,?,?,?,?,?,?,?,?,?,t,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,low
t,?,?,?,?,?,?,?,?,?,?,?,t,t,?,t,?,t,?,?,t,t,?,?,t,?,?,?,?,?,?,t,?,?,?,t,?,t,?,t,t,?,?,?,?,?,?,?,t,t,?,?,?,?,?,?,?,?,t,?,?,?,?,t,?,?,t,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,low
?,?,?,?,?,?,?,?,?,?,?,?,t,t,?,t,t,?,?,?,?,?,?,?,t,t,t,?,?,?,?,t,?,?,?,t,?,?,t,?,?,t,?,?,?,?,?,?,t,?,?,t,t,?,?,?,?,t,?,?,t,?,?,t,?,?,?,?,?,?,t,?,?,?,?,t,?,?,?,?,?,?,?,?,t,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,t,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,?,low
... 
... 
... 

Error We Encountered in Weka When Loading a Dataset in the "Sparse Matrix, Varying Columns" Input Format

IMG 1

IMG 2

Apriori Output in Weka For Supermarket Data

=== Run information ===

Scheme:       weka.associations.Apriori -N 10 -T 0 -C 0.9 -D 0.05 -U 1.0 -M 0.1 -S -1.0 -c -1
Relation:     supermarket
Instances:    4627
Attributes:   217
              [list of attributes omitted]
=== Associator model (full training set) ===


Apriori
=======

Minimum support: 0.15 (694 instances)
Minimum metric <confidence>: 0.9
Number of cycles performed: 17

Generated sets of large itemsets:

Size of set of large itemsets L(1): 44

Size of set of large itemsets L(2): 380

Size of set of large itemsets L(3): 910

Size of set of large itemsets L(4): 633

Size of set of large itemsets L(5): 105

Size of set of large itemsets L(6): 1

Best rules found:

 1. biscuits=t frozen foods=t fruit=t total=high 788 ==> bread and cake=t 723    <conf:(0.92)> lift:(1.27) lev:(0.03) [155] conv:(3.35)
 2. baking needs=t biscuits=t fruit=t total=high 760 ==> bread and cake=t 696    <conf:(0.92)> lift:(1.27) lev:(0.03) [149] conv:(3.28)
 3. baking needs=t frozen foods=t fruit=t total=high 770 ==> bread and cake=t 705    <conf:(0.92)> lift:(1.27) lev:(0.03) [150] conv:(3.27)
 4. biscuits=t fruit=t vegetables=t total=high 815 ==> bread and cake=t 746    <conf:(0.92)> lift:(1.27) lev:(0.03) [159] conv:(3.26)
 5. party snack foods=t fruit=t total=high 854 ==> bread and cake=t 779    <conf:(0.91)> lift:(1.27) lev:(0.04) [164] conv:(3.15)
 6. biscuits=t frozen foods=t vegetables=t total=high 797 ==> bread and cake=t 725    <conf:(0.91)> lift:(1.26) lev:(0.03) [151] conv:(3.06)
 7. baking needs=t biscuits=t vegetables=t total=high 772 ==> bread and cake=t 701    <conf:(0.91)> lift:(1.26) lev:(0.03) [145] conv:(3.01)
 8. biscuits=t fruit=t total=high 954 ==> bread and cake=t 866    <conf:(0.91)> lift:(1.26) lev:(0.04) [179] conv:(3)
 9. frozen foods=t fruit=t vegetables=t total=high 834 ==> bread and cake=t 757    <conf:(0.91)> lift:(1.26) lev:(0.03) [156] conv:(3)
10. frozen foods=t fruit=t total=high 969 ==> bread and cake=t 877    <conf:(0.91)> lift:(1.26) lev:(0.04) [179] conv:(2.92)  
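The per-rule metrics can be recomputed from the printed counts; a small Python sketch for rule 1 (the consequent's own support count is not shown in the log, so lift and conviction are left as functions of a hypothetical count):

```python
# Recompute confidence for rule 1 from the counts Weka prints:
# the antecedent covers 788 baskets, 723 of which also contain 'bread and cake'.
n_total = 4627       # Instances, from the run information
n_antecedent = 788
n_joint = 723

confidence = n_joint / n_antecedent
print(round(confidence, 2))  # 0.92, matching <conf:(0.92)>

# Lift and conviction additionally need the consequent's support,
# which the log omits; with a hypothetical count n_consequent:
def lift(n_consequent):
    return confidence / (n_consequent / n_total)

def conviction(n_consequent):
    return (1 - n_consequent / n_total) / (1 - confidence)
```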
Tags: Technology,Machine Learning,Weka

Three Types of Input Data Format For Apriori Algorithm

Input Data Format (1)

Two Column Format

order_id,product_id
2,33120
2,28985
2,9327
2,45918
2,30035
2,17794
2,40141
2,1819
2,43668
3,33754
3,24838
3,17704
3,21903
3,17668
3,46667
3,17461
3,32665
4,46842
4,26434
4,39758
4,27761
4,10054
4,21351
4,22598
4,34862
4,40285
4,17616
4,25146
4,32645
4,41276
5,13176
5,15005
5,47329
5,27966
5,23909
5,48370
5,13245
5,9633
5,27360
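Before Apriori can consume this two-column layout, rows sharing an order_id must be folded into baskets; a minimal pure-Python sketch using the first few rows above:

```python
from collections import defaultdict

# (order_id, product_id) pairs, as in the first rows of the listing above.
rows = [(2, 33120), (2, 28985), (2, 9327),
        (3, 33754), (3, 24838),
        (4, 46842)]

# Group product ids under their order id to form one basket per order.
baskets = defaultdict(list)
for order_id, product_id in rows:
    baskets[order_id].append(product_id)

print(dict(baskets))  # {2: [33120, 28985, 9327], 3: [33754, 24838], 4: [46842]}
```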

Input Data Format (2)

Fixed columns, True-False Format

,Apple,Bread,Butter,Cheese,Corn,Dill,Eggs,Ice cream,Kidney Beans,Milk,Nutmeg,Onion,Sugar,Unicorn,Yogurt,chocolate

0,False,True,False,False,True,True,False,True,False,False,False,False,True,False,True,True

1,False,False,False,False,False,False,False,False,False,True,False,False,False,False,False,False

2,True,False,True,False,False,True,False,True,False,True,False,False,False,False,True,True

3,False,False,True,True,False,True,False,False,False,True,True,True,False,False,False,False

4,True,True,False,False,False,False,False,False,False,False,False,False,False,False,False,False
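Converting a row of this fixed-column format back into an item list is a simple zip-and-filter; a sketch on a truncated slice of row 0 above (only the first seven columns, for brevity):

```python
# One row of the fixed-column True/False format, keyed by item name.
header = ["Apple", "Bread", "Butter", "Cheese", "Corn", "Dill", "Eggs"]
row0 = [False, True, False, False, True, True, False]  # row 0, first 7 columns

# Keep only the items whose flag is True.
basket = [item for item, present in zip(header, row0) if present]
print(basket)  # ['Bread', 'Corn', 'Dill']
```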

Input Data Format (3)

Sparse Matrix, Varying Columns Format

shrimp,almonds,avocado,vegetables mix,green grapes,whole weat flour,yams,cottage cheese,energy drink,tomato juice,low fat yogurt,green tea,honey,salad,mineral water,salmon,antioxydant juice,frozen smoothie,spinach,olive oil

burgers,meatballs,eggs

chutney

turkey,avocado

mineral water,milk,energy bar,whole wheat rice,green tea

low fat yogurt

whole wheat pasta,french fries

soup,light cream,shallot

frozen vegetables,spaghetti,green tea

french fries

eggs,pet food

cookies

turkey,burgers,mineral water,eggs,cooking oil

spaghetti,champagne,cookies

mineral water,salmon
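Since every line in this format is simply one basket, parsing needs nothing more than a comma split; a sketch over a few of the lines above:

```python
# A few lines of the varying-columns format, one basket per line.
raw = """burgers,meatballs,eggs
chutney
turkey,avocado
mineral water,salmon"""

baskets = [line.split(",") for line in raw.splitlines()]
print(baskets)
```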
Tags: Technology,Machine Learning

Sunday, March 6, 2022

Counting (Read it loud)




Tags: Mathematical Foundations for Data Science

Linear Regression Using Java Code And Weka JAR

File: Regression.java

import weka.core.Instance;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.classifiers.functions.LinearRegression;

public class Regression {
    public static void main(String args[]) throws Exception {
        // Load dataset
        DataSource source = new DataSource("/home/ashish/Desktop/ws/weka/e4_linear_regression_using_java_code/house.arff");
        Instances dataset = source.getDataSet();
        // Set class index to the last attribute
        dataset.setClassIndex(dataset.numAttributes() - 1);
        // Build model
        LinearRegression model = new LinearRegression();
        model.buildClassifier(dataset);
        // Output model
        System.out.println("LR FORMULA : " + model);
        // Now predicting the cost
        Instance myHouse = dataset.lastInstance();
        double price = model.classifyInstance(myHouse);
        System.out.println("-------------------------");
        System.out.println("PRECTING THE PRICE : " + price);
    }
}

File: house.arff

@RELATION house

@ATTRIBUTE houseSize NUMERIC
@ATTRIBUTE lotSize NUMERIC
@ATTRIBUTE bedrooms NUMERIC
@ATTRIBUTE granite NUMERIC
@ATTRIBUTE bathroom NUMERIC
@ATTRIBUTE sellingPrice NUMERIC

@DATA
3529,9191,6,0,0,205000
3247,10061,5,1,1,224900
4032,10150,5,0,1,197900
2397,14156,4,1,0,189900
2200,9600,4,0,1,195000
3536,19994,6,1,1,325000
2983,9365,5,0,1,230000

File: Execution.log

~/Desktop/ws/weka/e4_linear_regression_using_java_code$ javac -cp ./weka-3.7.0.jar Regression.java
~/Desktop/ws/weka/e4_linear_regression_using_java_code$ ls -l
total 5232
-rw-rw-r-- 1 ashish ashish     365 Mar  5 09:05 house.arff
drwxrwxr-x 2 ashish ashish    4096 Mar  5 09:12 jar
-rw-rw-r-- 1 ashish ashish    1714 Mar  5 09:24 Regression.class
-rw-rw-r-- 1 ashish ashish     924 Mar  5 09:18 Regression.java
-rw-rw-r-- 1 ashish ashish 5340945 Sep 27  2011 weka-3.7.0.jar
~/Desktop/ws/weka/e4_linear_regression_using_java_code$ java -cp .:./weka-3.7.0.jar Regression
LR FORMULA : Linear Regression Model

sellingPrice = -26.6882 * houseSize + 7.0551 * lotSize + 43166.0767 * bedrooms + 42292.0901 * bathroom + -21661.1208
-------------------------
PRECTING THE PRICE : 222921.57101904938

(base) ashish@ashish-VirtualBox:~/Desktop/ws/weka/e4_linear_regression_using_java_code$ ls -l
total 13852
-rw-rw-r-- 1 ashish ashish      923 Mar  5 09:25 execution.log
-rw-rw-r-- 1 ashish ashish      365 Mar  5 09:05 house.arff
drwxrwxr-x 2 ashish ashish     4096 Mar  6 15:04 jar
-rw-rw-r-- 1 ashish ashish     1714 Mar  5 09:24 Regression.class
-rw-rw-r-- 1 ashish ashish      924 Mar  5 09:18 Regression.java
-rwxrwxrwx 1 ashish ashish 14163929 Jan 25 16:06 weka-3.8.6.jar
(base) ashish@ashish-VirtualBox:~/Desktop/ws/weka/e4_linear_regression_using_java_code$ java -cp .:./weka-3.8.6.jar Regression
Exception in thread "main" java.lang.NoClassDefFoundError: no/uib/cipr/matrix/Matrix
        at Regression.main(Regression.java:15)
Caused by: java.lang.ClassNotFoundException: no.uib.cipr.matrix.Matrix
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
        at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
        ... 1 more
(base) ashish@ashish-VirtualBox:~/Desktop/ws/weka/e4_linear_regression_using_java_code$ javac -cp .:./weka-3.8.6.jar Regression.java
(base) ashish@ashish-VirtualBox:~/Desktop/ws/weka/e4_linear_regression_using_java_code$ java -cp .:./weka-3.8.6.jar Regression
Exception in thread "main" java.lang.NoClassDefFoundError: no/uib/cipr/matrix/Matrix
        at Regression.main(Regression.java:15)
Caused by: java.lang.ClassNotFoundException: no.uib.cipr.matrix.Matrix
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
        at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
        ... 1 more

- - - TRYING AGAIN WITH WEKA-3.7.0.JAR:

(base) ashish@ashish-VirtualBox:~/Desktop/ws/weka/e4_linear_regression_using_java_code$ java -cp .:./weka-3.7.0.jar Regression
LR FORMULA : Linear Regression Model

sellingPrice = -26.6882 * houseSize + 7.0551 * lotSize + 43166.0767 * bedrooms + 42292.0901 * bathroom + -21661.1208
-------------------------
PRECTING THE PRICE : 222921.57101904938

The weka-3.8.6 run fails, presumably because this copy of weka-3.8.6.jar does not bundle the mtj matrix library (no.uib.cipr.matrix) that Weka 3.8's LinearRegression depends on; adding that library to the classpath, or falling back to weka-3.7.0.jar as above, works around it.
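As a sanity check, plugging the last house.arff instance (2983, 9365, 5, 0, 1) into the formula Weka printed reproduces the logged prediction up to coefficient rounding; a small Python sketch:

```python
# Coefficients exactly as printed in the Weka log (granite was dropped
# by Weka's attribute selection, so it does not appear in the formula).
house_size, lot_size, bedrooms, bathroom = 2983, 9365, 5, 1  # last instance

price = (-26.6882 * house_size
         + 7.0551 * lot_size
         + 43166.0767 * bedrooms
         + 42292.0901 * bathroom
         - 21661.1208)
print(round(price, 2))  # close to the logged 222921.57; the small gap is rounding
```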
Tags: Technology,Machine Learning,Regression

Weka clustering experiment on Iris dataset

1: Weka Explorer: Preprocess tab: Before clustering

2: Weka Explorer: Cluster tab: Ignore attribute

3: Weka Explorer: Cluster tab: Algo (Expectation Maximization) parameters

4: Weka Explorer: Cluster tab: Algo EM: Results

5: Weka Explorer: Cluster tab: kMeans algo: Parameters

6: Weka Explorer: Cluster tab: kMeans: results (A)

7: Weka Explorer: Cluster tab: kMeans: results (B)

Tags: Technology,Machine Learning,Clustering

Weka classification experiment on Iris dataset

1: Weka Explorer: Preprocess Tab: Iris dataset

2: Weka Experiment Environment: New experiment

3: Weka Experiment Environment: Selecting Naive Bayes' Classifier

4: Weka Experiment Environment: Selecting kNN for comparison with Naive Bayes' Classifier

5: Weka Experiment Environment: Go to 'Run' tab and click on 'start'

6: Weka Experiment Environment: Go to 'Analyze' tab and compare three algorithms

7: Weka Experiment Environment: Analyze tab: Perform test

Tags: Technology,Machine Learning,Classification

Saturday, March 5, 2022

Sneha Kiran, Vijaya Marneni, and Dinesh Sawant

Index of Journals
The role that Sneha Kiran played at Mobileum, from when I joined the company in May 2015 until she left in May 2016, was that of a UI/UX expert and people manager.

She wasn't just managing Mobileum's internal team of GUI developers, but also the UI/UX designers' team and the third-party VibrantInfo team of GUI developers.
My memories of interactions with Sneha are of task-allocation calls with just the team lead, or with the entire team during the weekly team meetings held over conference calls from Mobileum's meeting rooms (calls I used to join alongside Yajuvendra from the Gurugram office).

At times, I felt she was protective of the team and mixed and interacted with them well. I say this because of the memories I have of project development managers like Rajesh Jindal and Prashant Saxena. Even under extreme pressure from these managers over mail, with their directors (like Vishal Gandhi and Dinesh Sawant) in the loop and at times Shekhar (Vice President) as well, Sneha wouldn't lose her composure. She would take her time responding, and apologize if there were unjust delays, but deliverables never got hurt and quality work was done.

And it wasn't just my project development managers; there were others too, like the QA managers Kavita Surve (of NTR) and Somshekhar in the Bengaluru office.

Sneha and I are friends to this day; at least, I consider her one.
When I told her I was working on a piece titled "Sneha Kiran, Vijaya Marneni and Dinesh Sawant", she said she wasn't in a very good phase of life. So I promised her that once I completed the post, I would let her review it before putting it online, and she replied with a 'thumbs up'.
Now the thing that's going on in my head is not a memory that needs reminiscing. It is a question, rather a big one, that could become a separate post in its own right.

The questions are:
Why did Sneha leave Mobileum?
Was hers a case of a "glass ceiling preventing promotion", or of "promotion as per the Peter principle"?

I think it was both. She could have taken up a better role with one of the design teams, or marketing teams or some other department if not engineering.

But that is not what happened. What happened was that Vijaya Marneni, an outsider and a techie, took Sneha's place when she left, holding the designation of GUI Architect and Manager.

Vijaya did not stay long in the job, which was more of a charade, a mockery of his Architect designation, and closer to a "follow-up task manager's" job.

If I remember correctly, Vijaya left Mobileum by Jan 2017, and the GUI team was disintegrated thereafter; individual UI developers were aligned with the directors of the business verticals they were working in.
This was how I came under Dinesh Sawant.
I was not very happy with the new arrangement, and I wrote a one-to-one mail to Shekhar when I first received news of this new development. Or disintegration, as I should more appropriately call it.
Vijaya leaving the organisation doesn't look like a surprise to me in hindsight today. His people skills were not on par with Sneha's, and there was no architect work for him in the GUI team.
I had asked Shekhar not to scatter the GUI team, to give the then team lead Apurba Das more of a leadership role, and maybe also to promote one of the Senior Software Engineers (SSEs) to share the load.
But my words fell on deaf ears.
When I started under Dinesh Sawant in Jan 2017, I was rowing two boats: one was my job, the other my M.Tech. program at BITS Pilani.
But as the Russian proverb goes:
"A person rowing with feet in two boats drowns."
I was relieved from the job in Aug 2018, but by then I had, luckily, finished my M.Tech. and joined Infosys.

Thank you for reading.
See you in the next post.
Tags: Journal,Behavioral Science,Management

Thursday, March 3, 2022

Saving Model, Loading Model and Making Predictions for Linear Regression (in Weka)

1: Weka Explorer. Preprocess Tab.

2: Weka Explorer. Classify Tab.

3: Weka Explorer. Visualize Tab

4: Weka Experiment. Setup Tab. Advanced Configuration.

5: Weka Experiment Environment. Analyze Tab

6: Weka Experiment Environment. Comparing ZeroR with Linear Regression.

7: Saving and Loading models from SimpleCLI (documentation)

8: Use of TAB key in Weka SimpleCLI (Doc)

9: Use of Tab key in Weka SimpleCLI (Demo)

10: Training and Saving Linear Regression model from SimpleCLI.

Weka, by default, picks the last column as the target. So, on reading our file, Weka considered 'Day Count' as the dependent variable and 'Close Price' as the independent variable for the 'COALINDIA' ticker data.
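The fix is simply to reorder the columns so that 'Close Price' comes last before the file reaches Weka; a pandas sketch (the values shown are made up for illustration):

```python
import pandas as pd

# Hypothetical COALINDIA data where the intended target ended up first.
df = pd.DataFrame({"Close Price": [158.2, 160.1, 159.4],
                   "Day Count": [1, 2, 3]})

# Move the intended target to the last position so Weka picks it up.
df = df[["Day Count", "Close Price"]]
print(list(df.columns))  # ['Day Count', 'Close Price']
```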

11: Dataset Corrected For Column Ordering

12: Training and Saving Linear Regression model after Correction in Dataset

13: Error during prediction for having only one col instead of two

14: Correction in test.csv

15: Error during prediction (string is not numeric)

16: Load Previously Saved Model in The Weka Explorer: Classify Tab

17: Select test data and select output predictions format

18: Select our previously saved model

19: View our saved model (Linear Regression) configuration

20: Re-evaluate model on current test set

21: View Classifier Output with saved model and extrapolated test data

Tags: Technology,Machine Learning,FOSS

Pandas DataFrame Filtering Using eval()

import pandas as pd

df = pd.DataFrame({
    "col1": [1, 2, 3, 1, 2, 3],
    "col2": ["A", "B", "A", "B", "A", "B"]
})

c = r"(df['col1'] == 1) | (df['col2'] == 'A')"
df[eval(c)]

df[df['col1'] == 1]

d = "df['col1'] == 1"
df[eval(d)]
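Since eval() executes arbitrary Python, a condition string from an untrusted source is risky; pandas' built-in DataFrame.query offers a safer equivalent of the first filter above:

```python
import pandas as pd

df = pd.DataFrame({
    "col1": [1, 2, 3, 1, 2, 3],
    "col2": ["A", "B", "A", "B", "A", "B"]
})

# Same condition as the eval() expression, written in query syntax.
filtered = df.query("col1 == 1 or col2 == 'A'")
print(filtered)  # rows 0, 2, 3, 4
```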
Tags: Technology,Machine Learning,Data Visualization