I am having some trouble normalizing the values of a list in Python. I am working on a project where I need to compare two data sets, but the values are on different scales: one list has values ranging from 0 to 200, and the other from 0 to 10. To normalize them, I have been trying to write a function that takes in the original list and returns the normalized list.
Here is the code that I have been working with:
def normalize_data(data):
    # min-max scaling: map the smallest value to 0 and the largest to 1
    max_val = max(data)
    min_val = min(data)
    normalized = []
    for val in data:
        normalized.append((val - min_val) / (max_val - min_val))
    return normalized
data1 = [24, 35, 100, 160, 199]
data2 = [2, 4, 6, 8, 10]
norm_data1 = normalize_data(data1)
norm_data2 = normalize_data(data2)
When I run this code, I get normalized values, but they are not what I was expecting. The values in the first list are normalized to be between 0 and 1, which is correct, but the values in the second list are not simply divided by 10 the way I expected: for example, 2 comes out as 0.0 and 4 comes out as 0.25, instead of 0.2 and 0.4. I’m not sure why this is happening, and I’ve tried changing the values in the function, but I still can’t seem to get it right. Can anyone please help me figure out what I’m doing wrong?
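For reference, here is what the function above actually returns when I print both results:

print(norm_data1)  # [0.0, 0.06285714285714286, 0.4342857142857143, 0.7771428571428571, 1.0]
print(norm_data2)  # [0.0, 0.25, 0.5, 0.75, 1.0]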