Hello everyone, something interesting came up earlier while I was programming, and I'm trying to wrap my mind around why it occurred. I wanted to take two dictionaries with the same keys and combine their values into one big super-dictionary.
def combine(d1, d2):
    for key in d1:
        if key in d2:
            d2[key] += d1[key]

When I assign values to each dictionary, this works perfectly.

d1 = {'a': 1, 'b': 2, 'c': 3}
d2 = {'a': 10, 'b': 20, 'c': 30}
combine(d1, d2)
# d1 is still {'a': 1, 'b': 2, 'c': 3}
# d2 is now   {'a': 11, 'b': 22, 'c': 33}

When I initialize the class that holds these dictionaries, though, I need to make sure that all the keys in d2 match the keys of d1. So I tried:

d1 = {'a': 0, 'b': 0, 'c': 0}
d2 = d1

My understanding was that d2 looked at d1 once, grabbed its keys and values, and went off to do its own thing, just as if you had typed:

x = 3
y = x
x = 6

y still holds the value 3. This turns out not to be the case with dictionaries, and I'm not sure why. Why, when you change a dictionary's values in place, does a "copied" dictionary take on the new values?

Thanks for any help you can provide.

Best,
Ryan
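P.S. Here is a minimal snippet showing exactly the behaviour I'm describing; the names and values are just made up for illustration:

d1 = {'a': 0, 'b': 0, 'c': 0}
d2 = d1                  # I expected this to make an independent copy
d1['a'] += 5             # change d1 in place
print(d2)                # {'a': 5, 'b': 0, 'c': 0} -- d2 changed too
print(d2 is d1)          # True -- both names refer to the same dict object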