Saturday, March 12, 2022

Alphabets (Auto-reader)

        
Tags: English Lessons,Communication Skills,

Friday, March 11, 2022

The Soviet Cauldron (Joke)

There is an old Soviet joke. An American dies and goes to hell. Satan himself shows him around. They pass a large cauldron. The American peers in. It’s full of suffering souls, burning in hot pitch. As they struggle to leave the pot, low-ranking devils, sitting on the rim, pitchfork them back in. The American is properly shocked. Satan says, “That’s where we put sinful Englishmen.” 

The tour continues. Soon the duo approaches a second cauldron. It’s slightly larger, and slightly hotter. The American peers in. It is also full of suffering souls, all wearing berets. Devils are pitchforking would-be escapees back into this cauldron as well. “That’s where we put sinful Frenchmen,” Satan says. 

In the distance is a third cauldron. It’s much bigger and is glowing white-hot. The American can barely get near it. Nonetheless, at Satan’s insistence, he approaches it and peers in. It is absolutely packed with souls, barely visible under the surface of the boiling liquid. Now and then, however, one clambers out of the pitch and desperately reaches for the rim. Oddly, there are no devils sitting on the edge of this giant pot, but the clamberer disappears back under the surface anyway. 

The American asks, “Why are there no demons here to keep everyone from escaping?” Satan replies, “This is where we put the Russians. If one tries to escape, the others pull him back in.”


The Joke is About: The Gulag in the Soviet Union

The Gulag was a system of Soviet labour camps and accompanying detention and transit camps and prisons. From the 1920s to the mid-1950s it housed political prisoners and criminals of the Soviet Union. At its height, the Gulag imprisoned millions of people.
Key people: Aleksandr Isayevich Solzhenitsyn
Dates: 1930 - 1955
Related places: Russia, Soviet Union
Tags: Joke,Management,Politics,

Wednesday, March 9, 2022

Interpretation of output from Weka for Apriori Algorithm


Our Dataset: shown as a table image in the original post. It contains 9 transactions over the items item1 through item5, which is where the counts used below come from.

Best rules found in Weka using Apriori:

 1. item5=t 2 ==> item1=t 2    <conf:(1)> lift:(1.5) lev:(0.07) [0] conv:(0.67)
 2. item4=t 2 ==> item2=t 2    <conf:(1)> lift:(1.29) lev:(0.05) [0] conv:(0.44)
 3. item5=t 2 ==> item2=t 2    <conf:(1)> lift:(1.29) lev:(0.05) [0] conv:(0.44)
 4. item2=t item5=t 2 ==> item1=t 2    <conf:(1)> lift:(1.5) lev:(0.07) [0] conv:(0.67)
 5. item1=t item5=t 2 ==> item2=t 2    <conf:(1)> lift:(1.29) lev:(0.05) [0] conv:(0.44)
 6. item5=t 2 ==> item1=t item2=t 2    <conf:(1)> lift:(2.25) lev:(0.12) [1] conv:(1.11)
 7. item1=t item4=t 1 ==> item2=t 1    <conf:(1)> lift:(1.29) lev:(0.02) [0] conv:(0.22)
 8. item3=t item5=t 1 ==> item1=t 1    <conf:(1)> lift:(1.5) lev:(0.04) [0] conv:(0.33)
 9. item3=t item5=t 1 ==> item2=t 1    <conf:(1)> lift:(1.29) lev:(0.02) [0] conv:(0.22)
10. item2=t item3=t item5=t 1 ==> item1=t 1    <conf:(1)> lift:(1.5) lev:(0.04) [0] conv:(0.33)

Rule 1: item5=t 2 ==> item1=t 2 <conf:(1)> lift:(1.5) lev:(0.07) [0] conv:(0.67)

INTERPRETATION OF RULES:

item5=t 2

Meaning: item5 is in two transactions.

item1=t 2

Meaning: item1 appears together with item5 in two transactions. In Weka's rule output, the count printed after the consequent is the support count of the whole rule; item1 on its own appears in six transactions.

Confidence(X -> Y) = P(Y | X) = (# txns with both X and Y) / (# txns with X)
Confidence(item5 -> item1) = 2/2 = 1

Lift = P(A and B) / (P(A) * P(B)), with A = item5 and B = item1
Lift(item5 -> item1) = (2/9) / ((2/9) * (6/9)) = 1.5

Conviction(X -> Y) = (1 - supp(Y)) / (1 - conf(X -> Y))
Support(X) = Support(item5) = 2/9
Support(Y) = Support(item1) = 6/9
Support(X -> Y) = 2/9
Confidence(item5 -> item1) = 1
Conviction(item5 -> item1) = (1 - (6/9)) / (1 - 1), which is a division by zero under this textbook formula. Weka nonetheless reports conv:(0.67); see the sketch after Rule 2 below.

Coverage (also called cover or LHS support) is the support of the left-hand side of the rule X => Y, i.e., supp(X). It measures how often the rule can be applied, and it can be calculated directly from the rule's support and confidence.

Leverage computes the difference between the observed frequency of A and C appearing together and the frequency that would be expected if A and C were independent. A leverage value of 0 indicates independence. Range: [-1, 1].
Leverage(A -> C) = support(A -> C) - support(A) * support(C)
Leverage(item5 -> item1) = support(item5 -> item1) - support(item5) * support(item1) = (2/9) - (2/9)*(6/9) = 0.074
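To make the arithmetic concrete, here is a minimal Python sketch (not part of the original post) that recomputes Rule 1's measures from the counts used above: 9 transactions, item5 in 2 of them, item1 in 6, and item1 together with item5 in 2.

# Rule 1: item5 -> item1, recomputed from raw counts.
n = 9                                          # total transactions
count_lhs, count_rhs, count_both = 2, 6, 2     # item5, item1, {item1, item5}

supp_lhs = count_lhs / n            # support(item5) = 2/9
supp_rhs = count_rhs / n            # support(item1) = 6/9
supp_rule = count_both / n          # support(rule)  = 2/9

confidence = count_both / count_lhs            # 2/2 = 1.0
lift = supp_rule / (supp_lhs * supp_rhs)       # 1.5
leverage = supp_rule - supp_lhs * supp_rhs     # ~0.074
coverage = supp_lhs                            # LHS support, 2/9

print(confidence, round(lift, 2), round(leverage, 3), round(coverage, 3))
# 1.0 1.5 0.074 0.222

These are exactly the conf, lift and lev values Weka prints for Rule 1.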

Rule 2: item4=t 2 ==> item2=t 2 <conf:(1)> lift:(1.29) lev:(0.05) [0] conv:(0.44)

Confidence(X -> Y) = P(Y | X) = (# of txns with both X and Y) / (# of txns with X)
X = item4, Y = item2
# of txns with both item4 and item2 = 2
# of txns with item4 = 2
Confidence(item4 -> item2) = 2/2 = 1

Lift = P(A and B) / (P(A) * P(B)), with A = item4 and B = item2
P(A and B) = 2/9
P(A) * P(B) = (2/9) * (7/9)
Lift(item4 -> item2) = (2/9) / ((2/9) * (7/9)) = 9/7 ≈ 1.29
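Rules 1 and 2 both have confidence 1, yet Weka prints finite conviction values (conv:(0.67) and conv:(0.44)) rather than the division by zero that the textbook formula gives. The Python sketch below is an assumption, not something stated in the post or taken from Weka's documentation: a conviction computed from raw counts with +1 smoothing in the denominator happens to reproduce the values Weka reports for these rules.

# Assumed smoothing (hypothetical, inferred only from the reported values):
# add 1 to the count of antecedent-without-consequent transactions so the
# denominator never becomes zero when confidence = 1.
def smoothed_conviction(premise_count, consequent_count, both_count, n_txns):
    # Textbook conviction is (1 - supp(Y)) / (1 - conf(X -> Y)).
    return (premise_count * (n_txns - consequent_count)) / (n_txns * (premise_count - both_count + 1))

print(round(smoothed_conviction(2, 6, 2, 9), 2))   # Rule 1 (item5 -> item1): 0.67
print(round(smoothed_conviction(2, 7, 2, 9), 2))   # Rule 2 (item4 -> item2): 0.44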

Rule 10: item2=t item3=t item5=t 1 ==> item1=t 1 <conf:(1)> lift:(1.5) lev:(0.04) [0] conv:(0.33)

item2=t item3=t item5=t 1

Meaning: Items 2, 3 and 5 appear together in one transaction.

item2=t item3=t item5=t 1 ==> item1=t 1

Meaning: Items 1, 2, 3 and 5 appear together in one transaction.

Confidence(X -> Y) = P(Y | X) = (# txns with both X and Y) / (# txns with X) = 1/1 = 1
Lift = P(A and B) / (P(A) * P(B)) = (1/9) / ((1/9) * (6/9)) = 1.5

Leverage(A -> C) = support(A -> C) - support(A) * support(C)
support(Items 2, 3, 5) = 1/9
support(Item 1) = 6/9
support(Items 1, 2, 3, 5) = 1/9
Leverage(A -> C) = (1/9) - ((1/9) * (6/9)) = 0.037
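As an optional cross-check (not part of the original post), the same rules and measures can be reproduced in Python with the mlxtend library. The transaction list below is hypothetical: the original dataset was shown as an image, so this list is only constructed to be consistent with the counts used in this post.

import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

# Hypothetical 9-transaction dataset, consistent with the counts used above
# (item1 in 6 transactions, item2 in 7, item4 in 2, item5 in 2, etc.).
transactions = [
    ["item1", "item2", "item5"],
    ["item2", "item4"],
    ["item2", "item3"],
    ["item1", "item2", "item4"],
    ["item1", "item3"],
    ["item2", "item3"],
    ["item1", "item3"],
    ["item1", "item2", "item3", "item5"],
    ["item1", "item2", "item3"],
]

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(transactions).transform(transactions), columns=te.columns_)

# Low min_support so that itemsets occurring in only 1 of 9 transactions survive.
frequent = apriori(onehot, min_support=0.1, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.9)
print(rules[["antecedents", "consequents", "support", "confidence",
             "lift", "leverage", "conviction"]])

Note that mlxtend uses the textbook conviction, so for confidence-1 rules it reports infinity rather than the finite values Weka prints.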
Tags: Technology,Machine Learning,Weka,

Tuesday, March 8, 2022

Alphabets (Read it aloud, letter by letter)



           

Tags: English Lessons,Communication Skills,

Monday, March 7, 2022

Alphabets Carousel

        


Tags: English Lessons,Communication Skills,

Anomalies in 'survival8' Viewers' Stats (Mar 2022)

Anomaly 1: 10K Views on a single day (on 2021-Oct-29)

Anomaly 2: Views from an 'Unknown Region' (noted on 2022-Mar-7)

Tags: Technology,Machine Learning,