Cinema of the United States
The cinema of the United States, often referred to simply as Hollywood, has had a profound effect on cinema across the world since the early 20th century. The dominant style of American cinema is Classical Hollywood Cinema, which developed from 1917 to 1960 and still characterizes most American films today. While the French Lumière Brothers are generally credited with the birth of modern cinema, American cinema soon became the dominant force in the emerging industry. Since the 1920s, the American film industry has grossed more money every year than that of any other country in the world.



© This article uses material from Wikipedia® and is licensed under the GNU Free Documentation License and under the Creative Commons Attribution-ShareAlike License