God did it?
Every statement or implication, in scripture and since, that God is in complete control of every action or reaction on the planet was spoken by man.
Among God's earliest words to man, He was clear that He had given man dominion over the earth. He even had man name His various creations.
God Himself said there were others who had power on this earth besides Him, two in fact: satan and man.
Where did the confusion after that come from?
Who benefits the most from people thinking God is in control of everything, especially the bad things?
Who takes extreme glee in people portraying God as a killer and other sick things He is not?
Who has greater power to cause sickness and destruction in people's lives when they believe it's of God, and actually welcome it at the first sign instead of rebuking or resisting it?
Just something for American Christians™ to ponder.