
Multiplying Columns by Scalars in Pandas

Tags:

python

pandas

Suppose I have a pandas DataFrame with two columns named 'A' and 'B'.

Now suppose I also have a dictionary with keys 'A' and 'B', each mapped to a scalar. That is, dict['A'] = 1.2 and similarly for 'B'.

Is there a simple way to multiply each column of the DataFrame by these scalars?

Cheers!

asked Oct 30 '25 09:10 by Eduardo Sahione

1 Answer

As Wouter said, the recommended method is to convert the dict to a pandas.Series and multiply the two objects together:

result = df * pd.Series(myDict)
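
For context, here is a minimal self-contained sketch of the same idea; the DataFrame contents and the myDict values below are made up for illustration:

import pandas as pd

# Example data matching the question's setup
df = pd.DataFrame({'A': [1.0, 2.0, 3.0], 'B': [10.0, 20.0, 30.0]})
myDict = {'A': 1.2, 'B': 0.5}

# pd.Series(myDict) is indexed by 'A' and 'B', so the multiplication
# aligns on column labels and scales each column by its own scalar
result = df * pd.Series(myDict)

An equivalent, more explicit spelling is df.mul(pd.Series(myDict), axis='columns'), which aligns the Series against the DataFrame's columns in the same way.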

answered Nov 01 '25 00:11 by Wes McKinney