Why the fuck do i keep failing in math

I just got another F and I'm getting a D on my report card, I'm fucked
 
lock in and spend some time on it if you wanna get good
 
dont show up on test day, make an excuse that you're sick or something, then take the redo test (usually the teacher will put you in the hall or a quiet room while he teaches the lesson) and cheatmaxx with your phone. For math assignments just use AI and get high 90s. Pretend you're paying attention so you can ask the teacher to bump your mark up. If you're barely passing and there's an exam coming up, pay someone to pull the fire alarm on exam day; in my school, if anyone dies, has a seizure, gets badly sick, or there's a fire alarm, an intruder, or anything else that distracts the class from taking the exam, it gets cancelled and won't end up failing you. Then just lock in next year: ask other students which teachers are the easiest and get one that doesn't gaf if you cheat (blind old immigrant teachers usually work the best). Good luck
 
ignore above ape ^^^

exercise. if you get the basics of a concept you gotta exercise.
I used to look up exercises online on whatever topic we were studying, you can still do that if you want, but now you can just use AI for it.

Here is an example of a conversation with GPT-4o that generated exercises for me to do and then corrects me when I'm wrong so I can improve:

> generate about 15 exercises of varying difficulty on: entropy calculation optimization (you will generate actual exercises that I need to complete in order to improve)


> Certainly! Here are 15 exercises on entropy calculation optimization, designed to help you improve your understanding and skills in this area. The exercises range from basic concepts to more advanced applications.

Beginner Level

  1. Basic Entropy Calculation:
    • Given a probability distribution [0.1, 0.3, 0.4, 0.2], calculate the Shannon entropy.
    • Verify your result using a Python function.
  2. Binary Entropy Function:
    • Write a Python function that takes a probability p and returns the entropy of a binary variable with probabilities p and 1 − p.
    • Plot the entropy as p varies from 0 to 1.
  3. Entropy of Text:
    • Write a program to read a text file and compute the entropy of the characters in the file.
    • Compare the entropies of different text files.
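
(side note from me, not part of the AI reply: here's roughly how I sanity-check the first two exercises with a few lines of plain Python, base-2 logs so the answers come out in bits, just a rough sketch)

import math

def shannon_entropy(probs):
    # Shannon entropy in bits of a discrete distribution (zero probabilities are skipped)
    return -sum(p * math.log2(p) for p in probs if p > 0)

def binary_entropy(p):
    # entropy of a binary variable with probabilities p and 1 - p
    return shannon_entropy([p, 1 - p])

# Exercise 1: the distribution from the generated exercise
print(shannon_entropy([0.1, 0.3, 0.4, 0.2]))   # about 1.846 bits

# Exercise 2: binary entropy is 0 at p = 0 or p = 1 and peaks at 1 bit at p = 0.5
for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(p, round(binary_entropy(p), 4))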

Intermediate Level

  4. Joint Entropy:
    • Given the joint probability distribution of two discrete random variables P(X, Y) = \begin{pmatrix} 0.1 & 0.2 & 0.1 \\ 0.1 & 0.3 & 0.2 \end{pmatrix}, calculate the joint entropy H(X, Y).
  5. Conditional Entropy:
    • For the same joint distribution as in Exercise 4, calculate the conditional entropy H(Y|X).
  6. Cross-Entropy Loss Implementation:
    • Implement the cross-entropy loss function for a binary classification problem.
    • Use your implementation to calculate the loss for a small dataset of your choice.
  7. Information Gain in Decision Trees:
    • Write a function to calculate the information gain for a given split in a decision tree.
    • Apply your function to a small dataset and determine the best split based on information gain.
  8. Kullback-Leibler Divergence:
    • Given two probability distributions P = [0.4, 0.6] and Q = [0.5, 0.5], calculate the Kullback-Leibler divergence D_{KL}(P || Q).
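
(again just my own rough check, not the AI's answer key: this covers the joint/conditional entropy and KL exercises, using the chain rule H(Y|X) = H(X,Y) − H(X) so you don't have to build the conditional table by hand)

import math

def H(probs):
    # Shannon entropy in bits, ignoring zero entries
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Exercises 4 and 5: joint distribution P(X, Y), rows are values of X
P = [[0.1, 0.2, 0.1],
     [0.1, 0.3, 0.2]]

joint = [p for row in P for p in row]
H_XY = H(joint)                        # joint entropy H(X, Y), about 2.446 bits
H_X = H([sum(row) for row in P])       # marginal entropy H(X)
H_Y_given_X = H_XY - H_X               # chain rule, about 1.475 bits
print(H_XY, H_Y_given_X)

# Exercise 8: Kullback-Leibler divergence D_KL(P || Q) in bits
def kl_divergence(p, q):
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(kl_divergence([0.4, 0.6], [0.5, 0.5]))   # about 0.029 bits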

Advanced Level

  9. Maximum Entropy Distribution:
    • Given constraints that the mean of a distribution is 3 and the variance is 2, derive the maximum entropy distribution.
    • Implement a numerical method to verify your result.
  10. Entropy Rate of a Markov Chain:
    • Consider a Markov chain with states S = \{1, 2\} and transition matrix P = \begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix}. Calculate the entropy rate of this Markov chain.
  11. Entropy in Image Compression:
    • Write a program to compute the entropy of an image's pixel intensity distribution.
    • Compare the entropy of different images and discuss the implications for image compression.
  12. Mutual Information:
    • Given the joint probability distribution P(X, Y) = \begin{pmatrix} 0.1 & 0.3 \\ 0.2 & 0.4 \end{pmatrix}, calculate the mutual information I(X; Y).
  13. Entropy Optimization in Neural Networks:
    • Implement an entropy-based regularization term in the loss function of a neural network.
    • Train the network on a small dataset and analyze the impact of this regularization on the model's performance.
  14. Differential Entropy Calculation:
    • For a continuous random variable with probability density function f(x) = e^{-x} for x ≥ 0, calculate the differential entropy.
  15. Entropy-Based Clustering:
    • Implement an entropy-based method for clustering a dataset.
    • Apply your method to a synthetic dataset and evaluate the quality of the clusters.
These exercises should provide a comprehensive set of tasks to deepen your understanding and skills in entropy calculation and optimization.
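
back to me: two of the advanced ones are also quick to check once you see the trick. Rough sketch below (I worked out the stationary distribution for exercise 10 by hand, and used the identity I(X;Y) = H(X) + H(Y) − H(X,Y) for exercise 12):

import math

def H(probs):
    # Shannon entropy in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Exercise 10: entropy rate of the 2-state Markov chain
P = [[0.7, 0.3],
     [0.4, 0.6]]
# the stationary distribution solves pi = pi P; here 0.3 * pi1 = 0.4 * pi2, so pi = (4/7, 3/7)
pi = (4 / 7, 3 / 7)
rate = sum(w * H(row) for w, row in zip(pi, P))
print(rate)   # about 0.92 bits per step

# Exercise 12: mutual information via I(X;Y) = H(X) + H(Y) - H(X,Y)
Pxy = [[0.1, 0.3],
       [0.2, 0.4]]
px = [sum(row) for row in Pxy]            # marginal of X
py = [sum(col) for col in zip(*Pxy)]      # marginal of Y
I = H(px) + H(py) - H([p for row in Pxy for p in row])
print(I)      # about 0.006 bits, so X and Y are nearly independent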

as you can see it gives solid exercises so you can train whatever chapter you're on

keep doing that over and over, you don't have to love math, honestly I fucking hate math
 
keep doing that over and over, you don't have to love math, honestly I fucking hate math
Fr bro, seriously when the fuck am i gonna use quadratic equations💀
 
Fr bro, seriously when the fuck am i gonna use quadratic equations💀
in a lot of jobs actually, but with AI it's kind of useless now, just get the bot to do it for you
 
in my school if anyone dies, has a seizure, gets badly sick, fire alarm, intruder or anything that will distract the class from taking the exam
Baldwin pfp yet no compassion for others. Peak autism
 
keep doing that over and over, you don't have to love math, honestly I fucking hate math
I can feel the future job applications already. Eternal Wageslavemaxxing will go crazy
 
ik ur in america so this may not be as applicable for you, but i find using madasmaths.com and doing some 2 star and 3 star questions until i'm confident, then trying the 4 and 5 star ones, works well.

i did this for a week while experimenting with modafinil and got a lot of revision done (i think it's called studying in america)
 
Fr bro, seriously when the fuck am i gonna use quadratic equations💀
Math is probably the only useful shit in school
You won't need it if you work at McDonald's prob
 
Because ur rotting on .org
 
