Holidays

Image © Pexels via Pexels

The main holidays I celebrated growing up through the decades, and still do, were New Year, Easter, Bonfire Night and Christmas.  Celebrating Halloween came much later in my adult years.  All of these hold happy memories of times with my family, kids and grandkids.

These holidays carry their own traditions, and traditions meant a lot to my Mom, just as they mean a lot to me.  As long as I carry on doing the things she did, along with my own, they will never die out in a world where such things don't seem to matter to a lot of people anymore.  The traditions that Mom loved, and the ones we did together, always bring a smile to my face and happy memories.  As long as I can do them I will, and I will keep them alive, not just for me but for my grandkids and for Mom too, because I know she is here in spirit to enjoy them as well.

About Holidays

A holiday is a day or other period set aside for festivals or recreation.  They appear at various times during the four seasons.  Public holidays are set by public authorities and vary by state or region.  Religious holidays are set by religious organisations for their members and are often also observed as public holidays in religious-majority countries.  Some religious holidays, such as Christmas, have become secularised by some or all of those who observe them.  In addition to secularisation, many holidays have become commercialised due to the growth of industry.

Holidays can be thematic, celebrating or commemorating particular groups, events, or ideas, or non-thematic, days of rest that do not have any particular meaning.  In Commonwealth English, the term can also refer to any period of rest from work (what Americans would call a vacation) or to school holidays.  In the United States, the holidays typically refers to the period from Thanksgiving to New Year's.  (Thanksgiving is observed in the United States, Canada, Grenada, Saint Lucia and Liberia, unofficially in countries such as Brazil, Germany and the Philippines, and also in the Dutch town of Leiden and the Australian territory of Norfolk Island.)

If there is a celebration of some sort you will usually see lots of colourful fireworks.  

Image © Pexels via Pexels

A great display of blue fireworks.

New Year

You can read about New Year here.

Easter

You can read about Easter here.

Halloween

You can read about Halloween here.

Bonfire Night

You can read about Bonfire Night here.

Christmas

You can read about Christmas here.

Terminology

The word holiday comes from the Old English word hāligdæg (hālig “holy” + dæg “day”).  The word originally referred only to special religious days.

The word holiday has differing connotations in different regions.  In the United Kingdom and other Commonwealth nations, the word may refer to a period where leave from one's duties has been agreed upon.  This time is usually set aside for rest, travel, or participation in recreational activities, with entire industries targeted to coincide with or enhance these experiences.  The days of leave may not coincide with any specific customs or laws.  Employers and educational institutions may designate holidays themselves, which may or may not overlap with nationally or culturally relevant dates; this also comes under the same connotation, but it is the first sense described that this article is concerned with.  Modern use varies geographically.  In the United States, the word refers to nationally, religiously, or culturally observed day(s) of rest or celebration, or to the events themselves; a period of leave from work is instead called a vacation.  More broadly, in North America it means any dedicated day or period of celebration.

Global Holidays 

The celebration of the New Year has been a common holiday across cultures for at least four millennia.  Such holidays normally celebrate the last day of the year and the arrival of the next year in a calendar system.  In modern cultures using the Gregorian calendar, the New Year’s celebration spans New Year’s Eve on the 31st of December and New Year’s Day on the 1st of January.  However, other calendar systems also have New Year’s celebrations, such as Chinese New Year and Vietnamese Tet.  New Year’s Day is the most common public holiday, observed by all countries using the Gregorian calendar except Israel.

Christmas is a popular holiday globally due to the spread of Christianity.  The holiday is recognised as a public holiday in many countries in Europe, the Americas, Africa and Australasia and is celebrated by over 2 billion people.  Although a holiday with religious origins, Christmas is often celebrated by non-Christians as a secular holiday.  For example, 61% of British people celebrate Christmas in an entirely secular way.  Christmas has also become a tradition in some non-Christian countries.  For example, for many Japanese people, it has become customary to buy and eat fried chicken on Christmas.  

Public Holidays 

Read more about Public Holidays here.  

Substitute Holidays 

If a holiday coincides with another holiday or a weekend day a substitute holiday may be recognised in lieu.  In the United Kingdom, the government website states that “If a bank holiday is on a weekend, a substitute weekday becomes a bank holiday, normally the following Monday.”  The process of moving a holiday from a weekend day to the following Monday is known as Mondayisation in New Zealand.  
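
As a rough illustration of that "following Monday" rule, here is a minimal Python sketch.  The function name and the New Year's Day 2022 example are my own, and real substitution rules vary by country (and get more involved when two holidays fall back to back), so treat it as a sketch rather than an official calculation.

```python
from datetime import date, timedelta

def substitute_holiday(holiday: date) -> date:
    """Apply the simple 'following Monday' rule described above:
    a holiday falling on a weekend is observed on the next Monday."""
    if holiday.weekday() == 5:              # Saturday -> Monday, two days later
        return holiday + timedelta(days=2)
    if holiday.weekday() == 6:              # Sunday -> Monday, one day later
        return holiday + timedelta(days=1)
    return holiday                          # weekday holidays are unchanged

# New Year's Day 2022 fell on a Saturday, so the U.K. substitute
# bank holiday was Monday the 3rd of January 2022.
print(substitute_holiday(date(2022, 1, 1)))   # 2022-01-03
```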

Religious Holidays 

Many holidays are linked to faiths and religions (see etymology above).  Christian holidays are defined as part of the liturgical year, the chief ones being Easter and Christmas.  The Orthodox Christian and Western Roman Catholic patronal feast day or name day is celebrated on each place's patron saint's day, according to the Calendar of Saints.  Jehovah's Witnesses annually commemorate the Memorial of Jesus Christ's Death but do not celebrate other holidays with any religious significance, such as Easter, Christmas or New Year.  This holds especially true for holidays that have combined and absorbed rituals, overtones or practices from non-Christian beliefs into the celebration, as well as those that distract from or replace the worship of Jehovah.

In Islam, the largest holidays are Eid al-Fitr (immediately after Ramadan) and Eid al-Adha (at the end of the Hajj).  Ahmadi Muslims additionally celebrate Promised Messiah Day, Promised Reformer Day, and Khilafat Day, but, contrary to popular belief, none of these are regarded as holidays.  Hindus, Jains and Sikhs observe several holidays, one of the largest being Diwali (the Festival of Light).  Japanese holidays, as well as a few Catholic holidays, contain heavy references to several different faiths and beliefs.

Celtic, Norse, and Neopagan holidays follow the order of the Wheel of the Year.  For example, Christmas customs such as decorating trees and the colours green, red and white closely resemble those of Yule, a lesser Sabbat of the Wheel of the Year in modern Wicca (a modern Pagan belief).  Some are closely linked to Swedish festivities.  The Bahaʼí Faith observes 11 annual holidays on dates determined using the Bahaʼí calendar.  Jews have two holiday seasons: the Spring Feasts of Pesach (Passover) and Shavuot (Weeks, called Pentecost in Greek), and the Fall Feasts of Rosh Hashanah (Head of the Year), Yom Kippur (Day of Atonement), Sukkot (Tabernacles), and Shemini Atzeret (Eighth Day of Assembly).

See Also

You can see references and sources to the above articles here.  The above was sourced from a page on Wikipedia and is subject to change.  

Blog Posts

Links

Pexels: The image shown at the top of this page is the copyright of Pexels.  You can find more free stock photos there.

Television

Image © of Max Rahubovskiy via Pexels

Most of us have grown up watching a television screen of some sort.  For me, television was at its best in the 1970’s and 1980’s when it was proper family entertainment. 

I don’t watch much telly these days (and I certainly DO NOT watch the bullshit so-called news).  Like films, it has all become too woke for my liking.  What was once entertainment has become a form of brainwashing and lecturing and I don’t watch it live anymore. I don’t turn on my television much unless it is to watch a DVD via my DVD player, watch YouTube, or Amazon Prime, or watch something decent that fits in with my likes via my Amazon Fire TV stick 4K Max.  

I have had plenty of favourite television programmes over the decades, as a child and as an adult, but watching them with family in my favourite decade, the 70's, will always hold the most special memories for me.

I like most TV genres, with my favourites being Horror and Science Fiction.  I have favourite actors and actresses, the same as anyone else does, and they will be shown on this page.  I am not going to list every telly programme I have watched in my lifetime, that would be IMPOSSIBLE to remember, but I will list programmes I have watched and enjoyed that I think are worth watching for someone else.  Of course, your opinions may differ from mine, that's life.

About Television

Television (TV), also referred to as telly, is a telecommunication medium for transmitting moving images and sound.  The term can refer to a TV set or the medium of TV transmission.  Television is a mass medium for advertising, entertainment, news, and sports.

Television became available in crude experimental forms in the late 1920’s, but only after several years of further development was the new technology marketed to consumers.  After World War II, an improved form of black-and-white TV broadcasting became popular in the United Kingdom (U.K.) and the United States (U.S.), and TV sets became commonplace in homes, businesses, and institutions.  During the 1950’s, telly was the primary medium for influencing public opinion.  In the mid-1960’s, colour broadcasting was introduced in the U.S. and most other developed countries.

The availability of various types of archival storage media such as Betamax and Video Home System (VHS) tapes, Laser Discs, high-capacity hard disk drives, Compact Discs (CD's), Digital Versatile Discs (DVD's), flash drives, high-definition (HD) DVD's and Blu-ray Discs, and cloud digital video recorders has enabled viewers to watch pre-recorded material, such as movies, at home on their own time schedule.  For many reasons, especially the convenience of remote retrieval, the storage of television and video programming now also occurs on the cloud (such as the video-on-demand service by Netflix).  At the end of the first decade of the 2000's, digital television transmissions greatly increased in popularity.  Another development was the move from standard-definition TV (SDTV) (576i, with 576 interlaced lines of resolution, and 480i) to high-definition TV (HDTV), which provides a resolution that is substantially higher.  HDTV may be transmitted in different formats (1080p, 1080i and 720p).  Since 2010, with the invention of smart television, Internet television has increased the availability of television programs and movies via the Internet through streaming video services such as Netflix, Amazon Prime Video, and Hulu.

In 2013, 79% of the world's households owned a television set.  The replacement of earlier cathode-ray tube (CRT) screen displays with compact, energy-efficient, flat-panel alternative technologies such as liquid-crystal displays (LCD's), both fluorescent-backlit and light-emitting diode (LED), organic light-emitting diode (OLED) and plasma displays was a hardware revolution that began with computer monitors in the late 1990's.  Most television sets sold in the 2000's were flat-panel, mainly LED's.  Major manufacturers announced the discontinuation of CRT, Digital Light Processing (DLP), plasma, and even fluorescent-backlit LCD TV's by the mid-2010's.  In the near future, LED's are expected to be gradually replaced by OLED TV's.  In the mid-2010's, major manufacturers also announced that they would increasingly produce smart TV's.  Smart TV's with integrated Internet and Web 2.0 functions became the dominant form of television by the late 2010's.

Television signals were initially distributed only as terrestrial television using high-powered radio-frequency television transmitters to broadcast the signal to individual television receivers.  Alternatively, television signals are distributed by coaxial cable or optical fibre, satellite systems and, since the 2000’s via the Internet.  Until the early 2000’s, these were transmitted as analogue signals, but a transition to digital television was expected to be completed worldwide by the late 2010’s.  A standard television set consists of multiple internal electronic circuits, including a tuner for receiving and decoding broadcast signals.  A visual display device that lacks a tuner is correctly called a video monitor rather than a television.

Image © Wags05 via Wikipedia

Flat-screen televisions for sale at a consumer electronics store in 2008.

Etymology

The word television comes from the Ancient Greek τῆλε (tele) meaning far, and Latin visio meaning sight.  The first documented usage of the term dates back to 1900, when the Russian scientist Constantin Perskyi used it in a paper that he presented in French at the first International Congress of Electricity, which ran from the 18th to the 25th of August 1900 during the International World Fair in Paris.

The anglicised version of the term was first attested in 1907 when it was classed as a theoretical system to transmit moving images over telegraph or telephone wires.  It was formed in English or borrowed from the French word télévision.  In the 19th century and early 20th century, other proposals for the name of a then-hypothetical technology for sending pictures over distance were telephote (1880) and televista (1904).

The abbreviation TV is from 1948.  The use of the term to mean a television set dates from 1941.  The use of the term to mean television as a medium dates from 1927.

The term telly is more common in the United Kingdom (U.K.).  The slang term the tube or the boob tube derives from the bulky cathode-ray tube used in most TV's until the advent of flat-screen tellies.

The History Of Television

Mechanical Television

Read more about Mechanical Television here.

Facsimile transmission systems (FAX) for still photographs pioneered methods of mechanical scanning of images in the mid-19th century.  Alexander Bain introduced the facsimile machine between 1843 and 1846.  Frederick Bakewell demonstrated a working laboratory version in 1851.  Willoughby Smith discovered the photoconductivity of the element selenium in 1873.  As a 23-year-old German university student, Paul Julius Gottlieb Nipkow proposed and patented the Nipkow disk in 1884 in Berlin.  This was a spinning disk with a spiral pattern of holes in it, so each hole scanned a line of the image.  Although he never built a working model of the system, variations of Nipkow's spinning disk image rasteriser became exceedingly common.  Constantin Perskyi coined the word television (TV) in a paper read to the International Electricity Congress at the International World Fair in Paris on the 24th of August, 1900.  Perskyi's paper reviewed the existing electromechanical technologies, mentioning the work of Nipkow and others.  However, it was not until 1907 that developments in amplification tube technology by Lee de Forest and Arthur Korn, among others, made the design practical.

The first demonstration of the live transmission of images was by Georges Rignoux and A. Fournier in Paris in 1909.  A matrix of 64 selenium cells, individually wired to a mechanical commutator, served as an electronic retina.  In the receiver, a type of Kerr cell modulated the light and a series of differently angled mirrors attached to the edge of a rotating disc scanned the modulated beam onto the display screen.  A separate circuit regulated synchronisation.  The 8×8 pixel resolution in this proof-of-concept demonstration was just sufficient to clearly transmit individual letters of the alphabet.  An updated image was transmitted several times each second.

In 1911, Boris Rosing and his student Vladimir Zworykin created a system that used a mechanical mirror-drum scanner to transmit, in Zworykin's words, "very crude images" over wires to the Braun tube (cathode-ray tube) in the receiver.  Moving images were not possible because, in the scanner, the sensitivity was not enough and the selenium cell was very laggy.

In 1921, Edouard Belin sent the first image via radio waves with his belinograph.

By the 1920's, when amplification made TV practical, Scottish inventor John Logie Baird employed the Nipkow disk in his prototype video systems.  On the 25th of March, 1925, Baird gave the first public demonstration of televised silhouette images in motion, at Selfridges department store in London.  Since human faces had inadequate contrast to show up in his primitive system, he televised a ventriloquist's dummy named Stooky Bill, whose painted face had higher contrast, talking and moving.  By the 26th of January, 1926, he had demonstrated before members of the Royal Institution the transmission of an image of a face in motion by radio.  This is widely regarded as the world's first true public TV demonstration, exhibiting light, shade and detail.  Baird's system used the Nipkow disk for both scanning the image and displaying it.  A brightly illuminated subject was placed in front of a spinning Nipkow disk set with lenses which swept images across a static photocell.  The thallium sulphide (Thalofide) cell, developed by Theodore Case in the United States (U.S.), detected the light reflected from the subject and converted it into a proportional electrical signal.  This was transmitted by Amplitude Modulation (AM) radio waves to a receiver unit, where the video signal was applied to a neon light behind a second Nipkow disk rotating in synchronisation with the first.  The brightness of the neon lamp was varied in proportion to the brightness of each spot on the image.  As each hole in the disk passed by, one scan line of the image was reproduced.  Baird's disk had 30 holes, producing an image with only 30 scan lines, just enough to recognise a human face.  In 1927, Baird transmitted a signal over 438 miles (705 km) of telephone line between London and Glasgow.  Baird's original televisor now resides in the Science Museum, South Kensington.

In 1928, Baird's company (Baird Television Development Company/Cinema Television) broadcast the first transatlantic TV signal, between London and New York, and the first shore-to-ship transmission.  In 1929, he became involved in the first experimental mechanical TV service in Germany.  In November of the same year, Baird and Bernard Natan of Pathe established France's first television company, Television-Baird-Natan.  In 1931, he made the first outdoor remote broadcast, of The Derby.  In 1932, he demonstrated ultra-short-wave (USW) television.  Baird's mechanical system reached a peak of 240 lines of resolution on the British Broadcasting Corporation's (BBC) telecasts in 1936, though the mechanical system did not scan the televised scene directly.  Instead, a 17.5 mm film was shot, rapidly developed and then scanned while the film was still wet.

A U.S. inventor, Charles Francis Jenkins, also pioneered the television.  He published an article on Motion Pictures by Wireless in 1913 and transmitted moving silhouette images for witnesses in December 1923.  On the 13th of June, 1925, he publicly demonstrated the synchronised transmission of silhouette pictures.  In 1925 Jenkins used the Nipkow disk and transmitted the silhouette image of a toy windmill in motion, over a distance of 5 miles (8 km), from a naval radio station in Maryland to his laboratory in Washington, D.C., using a lensed disk scanner with a 48-line resolution.  He filed U.S. Patent No. 1,544,156 (Transmitting Pictures over Wireless) on the 13th of March, 1922, and was granted it on the 30th of June, 1925.

Herbert E. Ives and Frank Gray of Bell Telephone Laboratories gave a dramatic demonstration of mechanical television on the 7th of April, 1927.  Their reflected-light television system included both small and large viewing screens.  The small receiver had a 2-inch-wide by 2.5-inch-high screen (5 by 6 cm).  The large receiver had a screen 24 inches wide by 30 inches high (60 by 75 cm).  Both sets could reproduce reasonably accurate, monochromatic, moving images.  Along with the pictures, the sets received synchronised sound.  The system transmitted images over two paths.  The first was a copper wire link from Washington to New York City, then a radio link from Whippany, New Jersey.  Comparing the two transmission methods, viewers noted no difference in quality.  Subjects of the telecast included Secretary of Commerce Herbert Hoover.  A flying-spot scanner beam illuminated these subjects.  The scanner that produced the beam had a 50-aperture disk.  The disc revolved at a rate of 18 frames per second, capturing one frame about every 56 milliseconds (today’s systems typically transmit 30 or 60 frames per second, or one frame every 33.3 or 16.7 milliseconds respectively).  Telly historian Albert Abramson underscored the significance of the Bell Labs demonstration and said, “It was in fact the best demonstration of a mechanical television system ever made to this time. It would be several years before any other system could even begin to compare with it in picture quality.”
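
The frame-period figures quoted in brackets above follow directly from the frame rate, since the time per frame is simply one second divided by the frames per second.  A short Python sketch (the helper name is my own) makes the arithmetic explicit:

```python
def frame_period_ms(frames_per_second: float) -> float:
    """Milliseconds between successive frames for a given frame rate."""
    return 1000.0 / frames_per_second

for fps in (18, 30, 60):
    print(f"{fps:>2} frames/s -> {frame_period_ms(fps):.1f} ms per frame")
# 18 frames/s -> 55.6 ms per frame   (the 1927 Bell Labs scanner, roughly 56 ms)
# 30 frames/s -> 33.3 ms per frame
# 60 frames/s -> 16.7 ms per frame
```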

In 1928, WRGB, then W2XB, was started as the world’s first TV station.  It was broadcast from the General Electric (GE) facility in Schenectady, N.Y.  It was popularly known as WGY Television.  Meanwhile, in the Soviet Union, Leon Theremin had been developing a mirror drum-based television, starting with 16 lines resolution in 1925, then 32 lines and eventually 64 using interlacing in 1926.  As part of his thesis, on the 7th of May, 1926, he electrically transmitted, and then projected, near-simultaneous moving images on a 5-square-foot (0.46 m2) screen.

By 1927 Theremin had achieved an image of 100 lines, a resolution that was not surpassed until May 1932 by Radio Corporation of America (RCA), with 120 lines.

On Christmas Day in 1926, Kenjiro Takayanagi demonstrated a television system with a 40-line resolution that employed a Nipkow disk scanner and a cathode-ray tube (CRT) display at Hamamatsu Industrial High School in Japan.  This prototype is still on display at the Takayanagi Memorial Museum at Shizuoka University, Hamamatsu Campus.  His research in creating a production model was halted by SCAP (the Supreme Commander for the Allied Powers) after World War II.

Because only a limited number of holes could be made in the disks, and disks beyond a certain diameter became impractical, image resolution on mechanical television broadcasts was relatively low, ranging from about 30 lines up to 120 or so.  Nevertheless, the image quality of 30-line transmissions steadily improved with technical advances, and by 1933 the United Kingdom (U.K.) broadcasts using the Baird system were remarkably clear.  A few systems ranging into the 200-line region also went on the air.  Two of these were the 180-line system that Compagnie des Compteurs installed in Paris in 1935, and the 180-line system that Peck Television Corp. started in 1935 at station VE9AK in Montreal.  The advancement of all-electronic television (including image dissectors and other camera tubes and CRT's for the reproducer) marked the beginning of the end for mechanical systems as the dominant form of television.  Mechanical TV, despite its inferior image quality and generally smaller picture, would remain the primary television technology until the 1930's.  The last mechanical telecasts ended in 1939 at stations run by a handful of public universities in the U.S.

Image © of Hzeller via Wikipedia
Image © of Orrin Dunlap, Jnr.

John Logie Baird in 1925 with his televisor equipment and dummies James (on the left) and Stooky Bill (on the right). 

The above image is on page 650 of Popular Radio magazine, Vol. 10, No. 7, dated November 1926. It was published by Popular Radio, Inc. in New York, U.S.A.  You can download a copy of this magazine via World Radio History by clicking here.

Electronic Television 

Read more about Electronic Television here.

In 1897, English physicist J. J. Thomson was able, in his three well-known experiments, to deflect cathode rays, a fundamental function of the modern cathode-ray tube. The earliest version of the cathode ray tube (CRT) was invented by the German physicist Ferdinand Braun in 1897 and is also known as the Braun tube.  It was a cold-cathode diode, a modification of the Crookes tube, with a phosphor-coated screen.  Braun was the first to conceive the use of a CRT as a display device.  The Braun tube became the foundation of 20th-century television.  In 1906 the Germans Max Dieckmann and Gustav Glage produced raster images for the first time in a CRT.  In 1907, Russian scientist Boris Rosing used a CRT in the receiving end of an experimental video signal to form a picture.  He managed to display simple geometric shapes on the screen.

In 1908, Alan Archibald Campbell-Swinton, fellow of the Royal Society, published a letter in the scientific journal Nature in which he described how distant electric vision could be achieved by using a cathode-ray tube, or Braun tube, as both a transmitting and receiving device.  He expanded on his vision in a speech given in London in 1911, reported in The Times and the Journal of the Röntgen Society.  In another letter to Nature, published in October 1926, Campbell-Swinton also announced the results of some not-very-successful experiments he had conducted with G. M. Minchin and J. C. M. Stanton.  They attempted to generate an electrical signal by projecting an image onto a selenium-coated metal plate that was simultaneously scanned by a cathode ray beam.  These experiments were conducted before March 1914, when Minchin died, but they were later repeated by two different teams in 1937, by H. Miller and J. W. Strange from Electric and Musical Industries Ltd. (EMI), and by H. Iams and A. Rose from Radio Corporation of America (RCA).  Both teams succeeded in transmitting very faint images with Campbell-Swinton's original selenium-coated plate.  Although others had experimented with using a cathode-ray tube as a receiver, the concept of using one as a transmitter was novel.  The first cathode-ray tube to use a hot cathode was developed by John B. Johnson (who gave his name to the term Johnson noise) and Harry Weiner Weinhart of Western Electric and became a commercial product in 1922.

In 1926, Hungarian engineer Kalman Tihanyi designed a television (TV) system using fully electronic scanning and display elements and employing the principle of charge storage within the scanning (or camera) tube.  The problem of low sensitivity to light resulting in low electrical output from transmitting (or camera) tubes would be solved with the introduction of charge-storage technology by Kalman Tihanyi beginning in 1924.  His solution was a camera tube that accumulated and stored electrical charges (photoelectrons) within the tube throughout each scanning cycle.  The device was first described in a patent application he filed in Hungary in March 1926 for a television system he called Radioskop.  After further refinements included in a 1928 patent application, Tihanyi’s patent was declared void in Great Britain in 1930, so he applied for patents in the United States (U.S.).  Although his breakthrough would be incorporated into RCA’s iconoscope design in 1931, the U.S. patent for Tihanyi’s transmitting tube would not be granted until May 1939.  The patent for his receiving tube had been granted the previous October.  Both patents had been purchased by RCA prior to their approval.  Charge storage remains a basic principle in the design of imaging devices for television to the present day.  On Christmas Day, 1926, at Hamamatsu Industrial High School in Japan, Japanese inventor Kenjiro Takayanagi demonstrated a TV system with a 40-line resolution that employed a CRT display.  This was the first working example of a fully electronic television receiver and Takayanagi’s team later made improvements to this system parallel to other TV developments.  Takayanagi did not apply for a patent.

In the 1930’s, Allen B. DuMont made the first CRT to last 1,000 hours of use, which was one of the factors that led to the widespread adoption of TV.

On the 7th of September 1927, U.S. inventor Philo Farnsworth’s image dissector camera tube transmitted its first image, a simple straight line, at his laboratory at 202 Green Street in San Francisco.  By the 3rd of September 1928, Farnsworth had developed the system sufficiently to hold a demonstration for the press.  This is widely regarded as the first electronic television demonstration.  In 1929, the system was improved further by the elimination of a motor generator, so that his television system now had no mechanical parts.  That year, Farnsworth transmitted the first live human images with his system, including a three-and-a-half-inch image of his wife Elma (nicknamed Pem) with her eyes closed (possibly due to the bright lighting required).

Meanwhile, Vladimir Zworykin was also experimenting with the cathode-ray tube to create and show images.  While working for Westinghouse Electric in 1923, he began to develop an electronic camera tube.  But in a 1925 demonstration, the image was dim, had low contrast, and poor definition, and was stationary.  Zworykin’s imaging tube never got beyond the laboratory stage but RCA, which acquired the Westinghouse patent, asserted that the patent for Farnsworth’s 1927 image dissector was written so broadly that it would exclude any other electronic imaging device.  Thus RCA, on the basis of Zworykin’s 1923 patent application, filed a patent interference suit against Farnsworth. The U.S. Patent Office examiner disagreed in a 1935 decision, finding priority of invention for Farnsworth against Zworykin.  Farnsworth claimed that Zworykin’s 1923 system could not produce an electrical image of the type to challenge his patent.  Zworykin received a patent in 1928 for a colour transmission version of his 1923 patent application.  He also divided his original application in 1931.  Zworykin was unable or unwilling to introduce evidence of a working model of his tube that was based on his 1923 patent application. In September 1939, after losing an appeal in the courts, and being determined to go forward with the commercial manufacturing of television equipment, RCA agreed to pay Farnsworth US$1 million over a ten-year period, in addition to license payments, to use his patents.

In 1933, RCA introduced an improved camera tube that relied on Tihanyi’s charge storage principle.  Called the Iconoscope by Zworykin, the new tube had a light sensitivity of about 75,000 lux and thus was claimed to be much more sensitive than Farnsworth’s image dissector.  However, Farnsworth had overcome his power issues with his Image Dissector through the invention of a completely unique multipactor device that he began work on in 1930, and demonstrated in 1931.  This small tube could amplify a signal reportedly to the 60th power or better and showed great promise in all fields of electronics.  Unfortunately, an issue with the multipactor was that it wore out at an unsatisfactory rate.

At the Berlin Radio Show in August 1931, Manfred von Ardenne gave a public demonstration of a television system using a CRT for both transmission and reception, the first completely electronic television transmission.  However, Ardenne had not developed a camera tube, using the CRT instead as a flying-spot scanner to scan slides and film.  Ardenne achieved his first transmission of TV pictures on Christmas Eve, 1933, followed by test runs for a public television service in 1934.  The world's first electronically scanned TV service started in Berlin in 1935, the Fernsehsender Paul Nipkow, culminating in the live broadcast of the 1936 Summer Olympic Games from Berlin to public places all over Germany.

Philo Farnsworth gave the world's first public demonstration of an all-electronic TV system, using a live camera, at the Franklin Institute of Philadelphia on the 25th of August 1934, and for ten days afterwards.  Mexican inventor Guillermo Gonzalez Camarena also played an important role in early telly.  His experiments with TV (known as telectroescopía at first) began in 1931 and led to a patent for the trichromatic field sequential system colour TV in 1940.  In Britain, the EMI engineering team led by Isaac Shoenberg applied in 1932 for a patent for a new device they called the Emitron, which formed the heart of the cameras they designed for the British Broadcasting Corporation (BBC).  On the 2nd of November 1936, a 405-line broadcasting service employing the Emitron began at studios in Alexandra Palace, and transmitted from a specially built mast atop one of the Victorian building's towers.  It alternated for a short time with Baird's mechanical system in adjoining studios but was more reliable and visibly superior.  This was the world's first regular high-definition television (HDTV) service.

The original U.S. iconoscope was noisy, had a high ratio of interference to signal, and ultimately gave disappointing results, especially when compared to the high-definition (HD) mechanical scanning systems that became available.  The Electric and Musical Industries Ltd. (EMI) team, under the supervision of Isaac Shoenberg, analysed how the iconoscope (or Emitron) produces an electronic signal and concluded that its real efficiency was only about 5% of the theoretical maximum.  They solved this problem by developing, and patenting in 1934, two new camera tubes dubbed super-Emitron and CPS Emitron.  The super-Emitron was between ten and fifteen times more sensitive than the original Emitron and iconoscope tubes and, in some cases, this ratio was considerably greater.  It was used for outside broadcasting by the British Broadcasting Corporation (BBC), for the first time, on Armistice Day 1937, when the general public could watch on a TV set as the King laid a wreath at the Cenotaph.  This was the first time that anyone had broadcast a live street scene from cameras installed on the roofs of neighbouring buildings; neither Farnsworth nor RCA would do the same until the 1939 New York World's Fair.

On the other hand, in 1934, Zworykin shared some patent rights with the German licensee company Telefunken.  The image iconoscope (Superikonoskop in Germany) was produced as a result of the collaboration.  This tube is essentially identical to the super-Emitron.  The production and commercialisation of the super-Emitron and image iconoscope in Europe were not affected by the patent war between Zworykin and Farnsworth, because Dieckmann and Hell had priority in Germany for the invention of the image dissector, having submitted a patent application for their Lichtelektrische Bildzerlegerröhre für Fernseher (Photoelectric Image Dissector Tube for Television) in Germany in 1925, two years before Farnsworth did the same in the United States.  The image iconoscope (Superikonoskop) became the industrial standard for public broadcasting in Europe from 1936 until 1960 when it was replaced by the vidicon and plumbicon tubes.  Indeed, it was the representative of the European tradition in electronic tubes competing against the American tradition represented by the image orthicon.  The German company Heimann produced the Superikonoskop for the 1936 Berlin Olympic Games; later, Heimann also produced and commercialised it from 1940 to 1955.  From 1952 to 1958 the Dutch company Philips finally produced and commercialised the image iconoscope and multicon.

U.S. television broadcasting, at the time, consisted of a variety of markets in a wide range of sizes, each competing for programming and dominance with separate technology, until deals were made and standards agreed upon in 1941.  RCA, for example, used only Iconoscopes in the New York area, but Farnsworth Image Dissectors in Philadelphia and San Francisco.  In September 1939, RCA agreed to pay the Farnsworth Television and Radio Corporation royalties over the next ten years for access to Farnsworth’s patents.  With this historic agreement in place, RCA integrated much of what was best about Farnsworth Technology into their systems.  In 1941, the United States implemented 525-line television.  Electrical engineer Benjamin Adler played a prominent role in the development of television.

The world’s first 625-line TV standard was designed in the Soviet Union in 1944 and became a national standard in 1946.  The first broadcast in 625-line standard occurred in Moscow in 1948.  The concept of 625 lines per frame was subsequently implemented in the European CCIR standard.  In 1936, Kalman Tihanyi described the principle of plasma display, the first flat panel display system.

Early electronic TV sets were large and bulky, with analogue circuits made of vacuum tubes.  Following the invention of the first working transistor at Bell Labs, Sony founder Masaru Ibuka predicted in 1952 that the transition to electronic circuits made of transistors would lead to smaller and more portable TV sets.  The first fully transistorised, portable solid-state television set was the 8-inch Sony TV8-301, developed in 1959 and released in 1960.  This began the transformation of TV viewership from a communal viewing experience to a solitary viewing experience.  By 1960, Sony had sold over 4 million portable TV sets worldwide.

Image © unknown via Wikipedia and is in the public domain

Ferdinand Braun.

Image © unknown via Wikipedia and is in the public domain

Vladimir Zworykin in 1929.

The Westinghouse Electric and Manufacturing Company research engineer can be seen here with Mildred Birt demonstrating electronic television.

The broadcast images are projected on a mirror on the top of the cabinet making it possible for many to watch.

Image © unknown via Wikipedia

Manfred von Ardenne in 1933. 

Image © unknown via Wikipedia and is in the public domain

A Radio Corporation Of America Advertisement.

This RCA advertisement from the Radio & Television magazine (Vol. X, No. 2, June, 1939) is for the beginning of regular experimental television broadcasting from the NBC studios to the New York metropolitan area, U.S.A.

Image © unknown via Wikipedia and is in the public domain

An Indian-head test pattern.

This 2F21 monoscope tube motif was used from 1940 until the advent of colour television.  It was displayed when a television station first signed on every day.

Colour Television 

Read more about Colour Television here.

The basic idea of using three monochrome images to produce a colour image had been experimented with almost as soon as black-and-white television (TV) sets had first been built.  Although he gave no practical details, among the earliest published proposals for TV was one by Maurice Le Blanc, in 1880, for a colour system, including the first mentions in TV literature of line and frame scanning.  Polish inventor Jan Szczepanik patented a colour TV system in 1897, using a selenium photoelectric cell at the transmitter and an electromagnet controlling an oscillating mirror and a moving prism at the receiver.  But his system contained no means of analysing the spectrum of colours at the transmitting end, and could not have worked as he described it.  Another inventor, Hovannes Adamian, also experimented with colour television as early as 1907.  He claimed the first colour TV project, which was patented in Germany on the 31st of March, 1908, patent No. 197183, then in Britain, on the 1st of April 1908, patent No. 7219, in France (patent No. 390326) and in Russia in 1910 (patent No. 17912).

Scottish inventor John Logie Baird demonstrated the world’s first colour transmission on the 3rd of July, 1928, using scanning discs at the transmitting and receiving ends with three spirals of apertures, each spiral with filters of a different primary colour and three light sources at the receiving end, with a commutator to alternate their illumination.  Baird also made the world’s first colour broadcast on the 4th of February, 1938, sending a mechanically scanned 120-line image from Baird’s Crystal Palace studios to a projection screen at London’s Dominion Theatre.  Mechanically scanned colour television was also demonstrated by Bell Laboratories in June 1929 using three complete systems of photoelectric cells, amplifiers, glow-tubes, and colour filters, with a series of mirrors to superimpose the red, green, and blue images into one full-colour image.

The first practical hybrid system was again pioneered by John Logie Baird.  In 1940 he publicly demonstrated a colour TV combining a traditional black-and-white display with a rotating coloured disk.  This device was very deep, but was later improved with a mirror folding the light path into an entirely practical device resembling a large conventional console.  However, Baird was unhappy with the design, and, as early as 1944, had commented to a British government committee that a fully electronic device would be better.

In 1939, Hungarian engineer Peter Carl Goldmark introduced an electro-mechanical system while at CBS Broadcasting Inc. (CBS), which contained an Iconoscope sensor.  The CBS field-sequential colour system was partly mechanical, with a disc made of red, blue, and green filters spinning inside the television camera at 1,200 rpm, and a similar disc spinning in synchronisation in front of the cathode ray tube (CRT) inside the receiver set.  The system was first demonstrated to the Federal Communications Commission (FCC) on the 29th of August, 1940, and shown to the press on the 4th of September, 1940. 

CBS began experimental colour field tests using film as early as the 28th of August, 1940, and live cameras by the 12th of November, 1940.  The National Broadcasting Company (NBC), which was owned by the Radio Corporation of America (RCA), made its first field test of colour TV on the 20th of February, 1941.  CBS began daily colour field tests on the 1st of June, 1941.  These colour systems were not compatible with existing black-and-white television sets, and, as no colour TV sets were available to the public at this time, viewing of the colour field tests was restricted to RCA and CBS engineers and the invited press.  The War Production Board halted the manufacture of TV and radio equipment for civilian use from the 22nd of April, 1942 to the 20th of August, 1945, limiting any opportunity to introduce colour TV to the general public.

As early as 1940, Baird had started work on a fully electronic system he called Telechrome.  Early Telechrome devices used two electron guns aimed at either side of a phosphor plate.  The phosphor was patterned so the electrons from the guns only fell on one side of the patterning or the other.  Using cyan and magenta phosphors, a reasonable limited-colour image could be obtained.  He also demonstrated the same system using monochrome signals to produce a 3D image (called stereoscopic at the time).  A demonstration on the 16th of August, 1944, was the first example of a practical colour TV system.  Work on the Telechrome continued and plans were made to introduce a three-gun version for full colour.  However, Baird's untimely death in 1946 ended the development of the Telechrome system.  Similar concepts were common through the 1940's and 1950's, differing primarily in the way they re-combined the colours generated by the three guns.  The Geer tube was similar to Baird's concept but used small pyramids with the phosphors deposited on their outside faces, instead of Baird's 3D patterning on a flat surface.  The Penetron used three layers of phosphor on top of each other and increased the power of the beam to reach the upper layers when drawing those colours.  The Chromatron used a set of focusing wires to select the coloured phosphors arranged in vertical stripes on the tube.

One of the great technical challenges of introducing colour broadcast TV was the desire to conserve bandwidth, potentially three times that of the existing black-and-white standards, and not use an excessive amount of radio spectrum.  In the United States (U.S.), after considerable research, the National Television System Committee (NTSC) approved an all-electronic system developed by RCA, which encoded the colour information separately from the brightness information and greatly reduced the resolution of the colour information to conserve bandwidth.  As black-and-white TV's could receive the same transmission and display it in black-and-white, the colour system adopted is backwards compatible.  Compatible Colour, featured in RCA advertisements of the period, is mentioned in the song America from West Side Story (1957).  The brightness image remained compatible with existing black-and-white TV sets at slightly reduced resolution, while colour TV's could decode the extra information in the signal and produce a limited-resolution colour display.  The higher-resolution black-and-white and lower-resolution colour images combine in the brain to produce a seemingly high-resolution colour image.  The NTSC standard represented a major technical achievement.
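
To make the "full-resolution brightness plus reduced-resolution colour" idea a little more concrete, here is a simplified Python/NumPy sketch.  It only illustrates the principle of separating a luma channel from colour-difference signals and discarding some colour detail; the function and its name are my own, and it is not the actual NTSC encoder, which modulates the colour information onto a subcarrier within the existing channel.

```python
import numpy as np

def split_luma_chroma(rgb: np.ndarray):
    """Split an RGB image (H x W x 3 array of floats in 0..1) into a
    full-resolution luma (brightness) channel and two colour-difference
    channels kept at half the horizontal resolution.  A simplified sketch
    of the bandwidth-saving idea, not the real NTSC encoding."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    luma = 0.299 * r + 0.587 * g + 0.114 * b   # brightness, full resolution
    r_minus_y = r - luma                        # colour-difference signals
    b_minus_y = b - luma
    # keep only every second column of colour detail to save bandwidth
    return luma, r_minus_y[:, ::2], b_minus_y[:, ::2]
```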

The first colour broadcast was the first episode of the live programme The Marriage on the 8th of July, 1954.  During the following ten years most network broadcasts, and nearly all local programming, continued to be in black-and-white.  It was not until the mid-1960's that colour sets started selling in large numbers, due in part to the colour transition of 1965, in which it was announced that over half of all network prime-time programming would be broadcast in colour that autumn.  The first all-colour prime-time season came just one year later.  In 1972, the last holdout among daytime network programmes converted to colour, resulting in the first completely all-colour network season.

Early colour sets were either floor-standing console models or tabletop versions nearly as bulky and heavy, so in practice, they remained firmly anchored in one place.  General Electric’s (GE) relatively compact and lightweight Porta-Colour set was introduced in the spring of 1966.  It used a transistor-based ultrahigh-frequency (UHF) tuner.  The first fully transistorised colour television in the United States was the Quasar TV introduced in 1967.   These developments made watching colour television a more flexible and convenient proposition.

In 1972, sales of colour sets finally surpassed sales of black-and-white sets.  Colour broadcasting in Europe was not standardised on the Phase Alternating Line (PAL) format until the 1960's, and broadcasts did not start until 1967.  By this point, many of the technical issues in the early sets had been worked out, and the spread of colour sets in Europe was fairly rapid.  By the mid-1970's, the only stations broadcasting in black-and-white were a few high-numbered UHF stations in small markets and a handful of low-power repeater stations in even smaller markets such as vacation spots.  By 1979, even the last of these had converted to colour and, by the early 1980's, black-and-white sets had been pushed into niche markets, notably low-power uses, small portable sets, or for use as video monitor screens in lower-cost consumer equipment.  By the late 1980's even these areas switched to colour sets.

 

Image © Kskhh via Wikipedia

A 40″ Samsung Full HD LED TV.

Image © Denelson83 via Wikipedia and is in the public domain

SMPTE colour bars.

These are used in a test pattern, sometimes when no programme material is available.

Digital Television 

Read more about Digital Television here and here.

Digital television (DTV)  is the transmission of audio and video by digitally processed and multiplexed signals, in contrast to the totally analogue and channel-separated signals used by analogue television (TV).  Due to data compression, digital TV can support more than one programme in the same channel bandwidth.  It is an innovative service that represents the most significant evolution in TV broadcast technology since colour TV emerged in the 1950’s.  Digital TV’s roots have been tied very closely to the availability of inexpensive, high-performance computers.  It was not until the 1990’s that digital TV became possible.  Digital TV was previously not practically possible due to the impractically high bandwidth requirements of uncompressed digital video, requiring around 200 Mbit/s for a standard-definition television (SDTV) signal, and over 1 Gbit/s for high-definition television (HDTV).
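
The bandwidth figures above can be sanity-checked with a back-of-the-envelope calculation: uncompressed bitrate is roughly width × height × frame rate × bits per pixel.  The sketch below is my own illustration, assuming 10-bit 4:2:2 sampling (20 bits per pixel) and typical frame sizes and rates; exact figures depend on the sampling scheme and on whether blanking intervals are counted.

```python
def uncompressed_mbit_per_s(width: int, height: int,
                            frames_per_second: int,
                            bits_per_pixel: int = 20) -> float:
    """Rough uncompressed video bitrate in Mbit/s (active picture only)."""
    return width * height * frames_per_second * bits_per_pixel / 1e6

print(uncompressed_mbit_per_s(720, 576, 25))     # ~207 Mbit/s for 625-line SDTV
print(uncompressed_mbit_per_s(1920, 1080, 30))   # ~1,244 Mbit/s for 1080-line HDTV
```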

A digital TV service was proposed in 1986 by Nippon Telegraph and Telephone (NTT) and the Ministry of Posts and Telecommunication (MPT) in Japan, where there were plans to develop an Integrated Network System service.  However, it was not possible to practically implement such a digital TV service until the adoption of Discrete Cosine Transform (DCT) video compression technology made it possible in the early 1990’s.

In the mid-1980's, as Japanese consumer electronics firms forged ahead with the development of HDTV technology, the MUSE analogue format proposed by the Japan Broadcasting Corporation (NHK) was seen as a pacesetter that threatened to eclipse United States (U.S.) electronics companies' technologies.  Until June 1990, the Japanese MUSE standard, based on an analogue system, was the front-runner among the more than 23 technical concepts under consideration.  Then, a U.S. company, General Instrument, demonstrated the possibility of a digital TV signal.  This breakthrough was of such significance that the Federal Communications Commission (FCC) was persuaded to delay its decision on an advanced television (ATV) standard until a digitally-based standard could be developed.

In March 1990, when it became clear that a digital standard was possible, the FCC made a number of critical decisions.  First, the Commission declared that the new ATV standard must be more than an enhanced analogue signal, but be able to provide a genuine HDTV signal with at least twice the resolution of existing TV images.  Then, to ensure that viewers who did not wish to buy a new digital TV set could continue to receive conventional TV broadcasts, it dictated that the new ATV standard must be capable of being simulcast on different channels.  The new ATV standard also allowed the new DTV signal to be based on entirely new design principles.  Although incompatible with the existing NTSC standard, the new DTV standard would be able to incorporate many improvements.

The last standards adopted by the FCC did not require a single standard for scanning formats, aspect ratios, or lines of resolution.  This compromise resulted from a dispute between the consumer electronics industry (joined by some broadcasters) and the computer industry (joined by the film industry and some public interest groups) over which of the two scanning processes (interlaced or progressive) would be best suited for the newer digital HDTV compatible display devices.  Interlaced scanning, which had been specifically designed for older analogue cathode ray tube (CRT) display technologies, scans even-numbered lines first, then odd-numbered ones.  In fact, interlaced scanning can be looked at as the first video compression model as it was partly designed in the 1940’s to double the image resolution to exceed the limitations of the TV broadcast bandwidth.  Another reason for its adoption was to limit the flickering on early CRT screens whose phosphor-coated screens could only retain the image from the electron scanning gun for a relatively short duration.  However, interlaced scanning does not work as efficiently on newer devices such as Liquid-crystal display (LCD), for example, which are better suited to a more frequent progressive refresh rate.

Progressive scanning, the format that the computer industry had long adopted for computer display monitors, scans every line in sequence, from top to bottom.  Progressive scanning in effect doubles the amount of data generated for every full screen displayed in comparison to interlaced scanning by painting the screen in one pass in 1/60-second, instead of two passes in 1/30-second.  The computer industry argued that progressive scanning is superior because it does not flicker on the new standard of display devices in the manner of interlaced scanning.  It also argued that progressive scanning enables easier connections with the Internet, and is more cheaply converted to interlaced formats than vice versa.  The film industry also supported progressive scanning because it offered a more efficient means of converting filmed programming into digital formats.  For their part, the consumer electronics industry and broadcasters argued that interlaced scanning was the only technology that could transmit the highest quality pictures then (and currently) feasible, i.e., 1,080 lines per picture and 1,920 pixels per line.  Broadcasters also favoured interlaced scanning because their vast archive of interlaced programming is not readily compatible with a progressive format.  William F. Schreiber, who was director of the Advanced Television Research Program at the Massachusetts Institute of Technology from 1983 until his retirement in 1990, thought that the continued advocacy of interlaced equipment originated from consumer electronics companies that were trying to get back the substantial investments they made in the interlaced technology.
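
The "doubles the amount of data" point is just arithmetic: at the same refresh rate, a progressive pass carries every line of the picture while an interlaced pass carries only half of them.  A tiny sketch, with an illustrative line count and refresh rate as assumptions:

```python
LINES_PER_PICTURE = 1080   # illustrative full-picture line count
REFRESH_HZ = 60            # passes of the screen per second

# Progressive: every line of the picture is drawn on every pass (1/60 s).
progressive_lines_per_second = LINES_PER_PICTURE * REFRESH_HZ         # 64,800

# Interlaced: alternate lines per pass, so a full picture takes two
# passes (1/30 s) and half as many lines are sent each second.
interlaced_lines_per_second = (LINES_PER_PICTURE // 2) * REFRESH_HZ   # 32,400

print(progressive_lines_per_second / interlaced_lines_per_second)     # 2.0
```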

The digital TV transition started in the late 2000's.  Governments across the world set deadlines for analogue shutdown during the 2010's.  Initially, the adoption rate was low, as the first digital tuner-equipped TV sets were costly, but as the price of digital-capable TV sets dropped, more and more households converted to digital TV sets. 

Smart Television

Read more about Smart Television here.

The advent of digital television (TV) allowed innovations like smart TV sets.  A smart television, sometimes referred to as a connected TV or hybrid TV, is a TV set or set-top box with integrated Internet and Web 2.0 features, and is an example of technological convergence between computers, television sets and set-top boxes.  Besides the traditional functions of TV sets and set-top boxes provided through traditional Broadcasting media, these devices can also provide Internet TV, online interactive media, over-the-top content, as well as on-demand streaming media, and home networking access.  These TV’s come pre-loaded with an operating system.

Smart TV is not to be confused with Internet TV, Internet Protocol television or Web TV.  Internet television refers to the receiving of television content over the Internet instead of by traditional systems such as terrestrial, cable and satellite (although the Internet itself is received by these methods).  Internet protocol television (IPTV) is one of the emerging Internet television technology standards for use by TV networks.  Web TV is a term used for programs created by a wide variety of companies and individuals for broadcast on Internet TV.  A first patent was filed in 1994 (and extended the following year) for an intelligent TV system, linked with data processing systems, by means of a digital or analogue network.  Apart from being linked to data networks, one key point is its ability to automatically download necessary software routines, according to a user's demand, and process their needs.  In 2015, major TV manufacturers announced that they would produce only smart TV's for their middle-end and high-end models.  Smart TV's have become more affordable compared to when they were first introduced, with 46 million United States (U.S.) households having at least one as of 2019.

Image © LG via Wikipedia

An LG Smart TV.

3D Television 

Read more about 3D Television here.

3D television (3DTV) conveys depth perception to the viewer by employing techniques such as stereoscopic display, multi-view display, 2D-plus-depth, or any other form of 3D display.  Most modern 3D television (TV) sets use an active shutter 3D system or a polarised 3D system, and some are autostereoscopic without the need for glasses.  Stereoscopic 3D television was demonstrated for the first time on the 10th of August, 1928, by John Logie Baird in his company's premises at 133 Long Acre, London.  Baird pioneered a variety of 3D television systems using electromechanical and cathode-ray tube (CRT) techniques.  The first 3D TV was produced in 1935.  The advent of digital TV in the 2000's greatly improved 3D TV sets.  Although 3D TV sets are quite popular for watching 3D home media such as on Blu-ray discs, 3D programming has largely failed to make inroads with the public.  Many 3D TV channels which started in the early 2010's were shut down by the mid-2010's.  According to DisplaySearch, 3D TV shipments totalled 41.45 million units in 2012, compared with 24.14 million in 2011 and 2.26 million in 2010.  As of late 2013, the number of 3D TV viewers started to decline.

Broadcast Systems

Terrestrial Television

Read more about Terrestrial Television here and here.

Programming is broadcast by television (TV) stations, sometimes called channels, as stations are licensed by their governments to broadcast only over assigned channels in the TV band.  At first, terrestrial broadcasting was the only way TV could be widely distributed, and because bandwidth was limited, i.e., there were only a small number of channels available, government regulation was the norm.  In the United States (U.S.), the Federal Communications Commission (FCC) allowed stations to broadcast advertisements beginning in July 1941 but required public service programming commitments as a condition of the licence.  By contrast, the United Kingdom (U.K.) chose a different route, imposing a TV licence fee on owners of TV reception equipment to fund the British Broadcasting Corporation (BBC), which had public service as part of its Royal Charter.

WRGB claims to be the world’s oldest TV station, tracing its roots to an experimental station founded on the 13th of January, 1928, broadcasting from the General Electric (G.E.) factory in Schenectady, New York, U.S.  under the call letters W2XB.  It was popularly known as WGY Television after its sister radio station.  Later in 1928, G.E. started a second facility, this one in New York City, which had the call letters W2XBS and which today is known as WNBC.  The two stations were experimental in nature and had no regular programming, as receivers were operated by engineers within the company.  The image of a Felix the Cat doll rotating on a turntable was broadcast for two hours every day for several years as new technology was being tested by the engineers.  On the 2nd of November 1936, the BBC began transmitting the world’s first public regular high-definition service from the Victorian Alexandra Palace in north London.   It therefore claims to be the birthplace of TV broadcasting as we now know it.

With the widespread adoption of cable across the U.S. in the 1970’s and 1980’s, terrestrial TV broadcasts have been in decline.  In 2013 it was estimated that about 7% of U.S. households used an antenna.  A slight increase in use began around 2010 due to the switchover to digital terrestrial TV broadcasts, which offered pristine image quality over very large areas and offered an alternative to cable TV (CATV) for cord-cutters.  All other countries around the world are also in the process of either shutting down analogue terrestrial TV or switching over to digital terrestrial TV.

Image © Tennen-Gas via Wikipedia

A modern high-gain UHF Yagi television antenna.

This antenna is used for UHF HDTV reception.  The antenna’s main lobe is off the right end of the antenna and it is most sensitive to stations in that direction.  Each of the metal crossbars along the antenna support boom is called an element, which acts as a half-wave dipole resonator for the radio waves.  The antenna has one driven element, located behind the black box, which feeds the received signal down the cable to the TV.  The black box is a preamplifier which increases the power of the TV signal before it is sent to the TV set.  The 17 elements to the right of the driven element are called directors.  They reinforce the signal.  The 4 elements on the V-shaped boom are called a corner reflector and they serve to reflect the signal back toward the driven element.

Yagi HDTV antennas use a corner reflector to increase the bandwidth of the antenna.  The rest of the antenna increases the gain at higher channels, while the corner reflector increases the gain at lower channels.

Cable Television

Read more about Cable Television here and here.

Cable television (CATV) is a system of broadcasting television (TV) programming to paying subscribers via radio frequency (RF) signals transmitted through coaxial cables or light pulses through fibre-optic cables.  This contrasts with traditional terrestrial TV, in which the TV signal is transmitted over the air by radio waves and received by a television antenna attached to the TV.  Since the 2000’s, frequency modulation (FM) radio programming, high-speed Internet, telephone service, and similar non-television services have also been provided through these cables.  The abbreviation CATV is used for cable television in the United States (U.S.).  It originally stood for Community Access Television or Community Antenna Television, from cable television’s origins in 1948: in areas where over-the-air reception was limited by distance from transmitters or mountainous terrain, large community antennas were constructed and cable was run from them to individual homes.

Image © Peter Trieb via Wikipedia and is in the public domain

Coaxial cable.

This cable is used to carry cable television signals into cathode-ray tube and flat-panel TV sets.

Satellite Television

Read more about Satellite Television here.

Satellite television is a system of supplying television (TV) programming using broadcast signals relayed from communication satellites.  The signals are received via an outdoor parabolic reflector antenna usually referred to as a satellite dish and a low-noise block downconverter.  A satellite receiver then decodes the desired TV program for viewing on a television set.  Receivers can be external set-top boxes or a built-in TV tuner.  Satellite TV provides a wide range of channels and services, especially to geographic areas without terrestrial TV or cable TV (CATV).

The most common method of reception is direct-broadcast satellite TV, also known as direct-to-home.  In direct-broadcast satellite television (DBSTV) systems, signals are relayed from a direct broadcast satellite in the Ku band and are completely digital.  Satellite TV systems formerly used systems known as TV receive-only.  These systems received analogue signals transmitted in the C-band spectrum from fixed-satellite service (FSS) type satellites and required the use of large dishes.  Consequently, these systems were nicknamed big dish systems and were more expensive and less popular.

The direct-broadcast satellite (DBS) TV signals were earlier analogue signals and later digital signals, both of which require a compatible receiver.  Digital signals may include high-definition television (HDTV).  Some transmissions and channels are free-to-air or free-to-view, while many other channels are pay television requiring a subscription.  In 1945, British science fiction writer Arthur C. Clarke proposed a worldwide communications system which would function by means of three satellites equally spaced apart in Earth’s orbit.  This was published in the October 1945 issue of the Wireless World magazine and won him the Franklin Institute’s Stuart Ballantine Medal in 1963.

The first satellite TV signals from Europe to North America were relayed via the Telstar satellite over the Atlantic Ocean on the 23rd of July, 1962.  The signals were received and broadcast in North American and European countries and watched by over 100 million people.  Launched in 1962, the Relay 1 satellite was the first satellite to transmit TV signals from the U.S. to Japan.  The first geosynchronous communication satellite, Syncom 2, was launched on the 26th of July, 1963.

The world’s first commercial communications satellite, called Intelsat I and nicknamed Early Bird, was launched into geosynchronous orbit on the 6th of April, 1965.  The first national network of TV satellites, called Orbita, was created by the Soviet Union in October 1967 and was based on the principle of using the highly elliptical Molniya satellite for rebroadcasting and delivering television signals to ground downlink stations.  The first commercial North American satellite to carry TV transmissions was Canada’s geostationary Anik 1, which was launched on the 9th of November, 1972.  ATS-6, the world’s first experimental educational and Direct Broadcast Satellite, was launched on the 30th of May, 1974.  It transmitted at 860 MHz using wideband frequency modulation (FM) and had two sound channels.  The transmissions were focused on the Indian subcontinent but experimenters were able to receive the signal in Western Europe using home-constructed equipment that drew on ultra high frequency (UHF) television design techniques already in use.

The first in a series of Soviet geostationary satellites to carry Direct-To-Home television, Ekran 1, was launched on the 26th of October, 1976.  It used a 714 MHz UHF downlink frequency so that the transmissions could be received with existing UHF television technology rather than microwave technology.

Image © Brian Katt via Wikipedia

DBS satellite dishes.

These dishes are installed on an apartment complex in San Jose, California, U.S.A.

Internet Television

Read more about Internet Television here.

Internet television (or online television) is the digital distribution of television (TV) content via the Internet as opposed to traditional systems like terrestrial, cable, and satellite, although the Internet itself is received by terrestrial, cable, or satellite methods.  Internet television is a general term that covers the delivery of television series, and other video content, over the Internet by video streaming technology, typically by major traditional television broadcasters.  Internet television should not be confused with Smart TV, Internet Protocol Television (IPTV) or Web TV.  Smart television refers to a television set which has a built-in operating system.  IPTV is one of the emerging Internet television technology standards for use by television networks.  Web television is a term used for programs created by a wide variety of companies and individuals for broadcast on Internet television.

Television Sets

Read more about Television Sets here.

A television set, also called a television receiver, television (TV), TV set, or telly, is a device that combines a tuner, display, amplifier, and speakers for the purpose of viewing television and hearing its audio components.  Introduced in the late 1920’s in mechanical form, television sets became a popular consumer product after World War II in electronic form, using cathode-ray tubes (CRT).  The addition of colour to broadcast television after 1953 further increased the popularity of TV sets and an outdoor antenna became a common feature of suburban homes.  The ubiquitous TV set became the display device for recorded media in the 1970’s, such as Betamax and Video Home System (VHS), which enabled viewers to record TV shows and watch prerecorded movies.  In the subsequent decades, TV sets were used to watch digital versatile discs (DVD) and Blu-ray Discs of movies and other content.  Major TV manufacturers announced the discontinuation of CRT, Digital Light Processing (DLP), plasma and fluorescent-backlit liquid-crystal displays (LCD) by the mid-2010’s.  Telly’s since the 2010’s have mostly used light-emitting diodes (LED).  These are expected to be gradually replaced by organic light-emitting diodes (OLED) in the near future.

Image © Fletcher6 via Wikipedia

An RCA Model 630-TS Television.

The RCA 630-TS was the first mass-produced television set.  It was sold in 1946 – 1947.

Display Technologies

Read more about Display Technologies here.

Disk

Read more about Disk here.

The earliest systems employed a spinning disk to create and reproduce images.  These usually had a low resolution and screen size and never became popular with the public.

CRT

Read more about CRT here.

The cathode-ray tube (CRT) is a vacuum tube used in a television (TV) containing one or more electron guns (a source of electrons or electron emitter) and a fluorescent screen used to view images.  It has a means to accelerate and deflect electron beams onto the screen to create the images.  The images may represent electrical waveforms (oscilloscope), pictures (tv, computer monitor), radar targets or others.  The cathode ray tube (CRT) uses an evacuated glass envelope which is large, deep (i.e. long from front screen face to rear end), fairly heavy, and relatively fragile.  As a matter of safety, the face is typically made of thick lead glass so as to be highly shatter-resistant and to block most X-ray emissions, particularly if the CRT is used in a consumer product.

In television sets and computer monitors, the entire front area of the tube is scanned repetitively and systematically in a fixed pattern called a raster.  An image is produced by controlling the intensity of each of the three electron beams, one for each additive primary colour (red, green, and blue), with a video signal as a reference.  In all modern CRT monitors and televisions, the beams are bent by magnetic deflection, a varying magnetic field generated by coils and driven by electronic circuits around the neck of the tube, although electrostatic deflection is commonly used in oscilloscopes, a type of diagnostic instrument.
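As a purely conceptual illustration of the raster-scan order described above, the following Python sketch walks a frame of (red, green, blue) intensities line by line, left to right, top to bottom, which is the order in which the three beams are driven.  Real CRT drive electronics are analogue and continuous; the drive_beams function here is a hypothetical stand-in, not a real hardware interface.

# Conceptual raster scan: the "video signal" is a nested list of (R, G, B)
# intensities, and "driving the beams" is modelled as visiting those values
# in raster order.  drive_beams() is a hypothetical placeholder only.

def drive_beams(line, x, r, g, b):
    # Stand-in for setting the instantaneous intensity of the red, green and
    # blue electron beams while the deflection coils aim at (line, x).
    print(f"line {line}, pixel {x}: R={r} G={g} B={b}")

def raster_scan(frame):
    """frame: list of scan lines, each a list of (r, g, b) tuples in 0-255."""
    for line_no, scan_line in enumerate(frame):      # top to bottom
        for x, (r, g, b) in enumerate(scan_line):    # left to right
            drive_beams(line_no, x, r, g, b)

# A tiny 2-line, 3-pixel "picture", just to show the scan order.
raster_scan([[(255, 0, 0), (0, 255, 0), (0, 0, 255)],
             [(0, 0, 0), (128, 128, 128), (255, 255, 255)]])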

Image © Blue tooth7 via Wikipedia

A 14″ cathode-ray tube.

This LG.Philips cathode-ray tube shows its deflection coils and electron guns.

DLP

Read more about DLP here.

Digital Light Processing (DLP) is a type of video projector technology that uses a digital micromirror device.  Some DLP’s have a television (TV) tuner, which makes them a type of TV display.  It was originally developed in 1987 by Dr. Larry Hornbeck of Texas Instruments.  While the DLP imaging device was invented by Texas Instruments, the first DLP-based projector was introduced by Digital Projection Ltd in 1997.  Digital Projection and Texas Instruments were both awarded Emmy Awards in 1998 for the invention of the DLP projector technology.  DLP is used in a variety of display applications from traditional static displays to interactive displays and non-traditional embedded applications including medical, security, and industrial uses.  DLP technology is used in DLP front projectors (standalone projection units for classrooms and businesses primarily), but also in private homes.  In these cases, the image is projected onto a projection screen.  DLP is also used in DLP rear projection TV sets and digital signs.  It is also used in about 85% of digital cinema projection.

Image © Dave Pape via Wikipedia and is in the public domain

A Christie Mirage 5000 DLP projector.

This projector made by Christie is circa 2001.  It was one of four being used in the CAVE virtual reality system at EVL in Chicago, U.S.A. and was capable of 120 Hz field-sequential stereo at 1280×1024 resolution, with 5000 lumens brightness.

Plasma

Read more about Plasma here.

A plasma display panel (PDP) is a type of flat panel display common to large television (TV) displays 30 inches (76 cm) or larger.  They are called plasma displays because the technology uses small cells containing electrically charged ionised gases, or what are in essence chambers more commonly known as fluorescent lamps.

LCD

Read more about LCD here.

Liquid-crystal-display (LCD) televisions are television (TV) sets that use LCD display technology to produce images.  LCD TV’s are much thinner and lighter than cathode-ray tubes (CRT) of similar display size and are available in much larger sizes (e.g., 90-inch diagonal).  When manufacturing costs fell, this combination of features made LCD’s practical for TV receivers.  LCD TV’s come in two types: those backlit by cold cathode fluorescent lamps, simply called LCD’s, and those backlit by light-emitting diodes (LED), called LED TV’s.

In 2007, LCD TV sets surpassed sales of CRT-based TV sets worldwide for the first time, and their sales figures relative to other technologies accelerated.  LCD TV sets have quickly displaced the only major competitors in the large-screen market, the plasma display panel and rear-projection TV.  By the mid-2010’s, LCD’s, especially LED-backlit ones, had become by far the most widely produced and sold TV display type.  LCD’s also have disadvantages.  Other technologies address these weaknesses, including organic light-emitting diode (OLED), field emission display (FED) and surface-conduction electron-emitter display (SED) TV’s, but as of 2014 none of these had entered widespread production.

OLED

Read more about OLED here.

An organic light-emitting diode (OLED) is a light-emitting diode in which the emissive electroluminescent layer is a film of organic compound which emits light in response to an electric current.  This layer of organic semiconductor is situated between two electrodes.  Generally, at least one of these electrodes is transparent.  OLED’s are used to create digital displays in devices such as television (TV) screens, computer monitors, and portable systems such as mobile phones, handheld game consoles and personal digital assistants (PDA).

There are two main groups of OLED, those based on small molecules and those employing polymers.  Adding mobile ions to an OLED creates a light-emitting electrochemical cell (LEC), which has a slightly different mode of operation.  OLED displays can use either passive-matrix or active-matrix addressing schemes.  Active-matrix OLED’s require a thin-film transistor backplane to switch each individual pixel on or off but allow for higher resolution and larger display sizes.

An OLED display works without a backlight.  Thus, it can display deep black levels and can be thinner and lighter than a liquid crystal display (LCD).  In low ambient light conditions such as a dark room, an OLED screen can achieve a higher contrast ratio than an LCD, whether it uses cold cathode fluorescent lamps or a light-emitting diode (LED) backlight.  OLED’s are expected to replace other forms of display in the near future.

Image © LG via Wikipedia

An LG 3D OLED TV.

Display Resolution

LDTV

Read more about LDTV here.

Low-definition television (LDTV) refers to television (TV) systems that have a lower screen resolution than standard-definition TV systems, such as 240p (320×240).  It is used in handheld tellies.  The most common source of LDTV programming is the Internet, where mass distribution of higher-resolution video files could overwhelm computer servers and take too long to download.  Many mobile phones and portable devices such as Apple’s iPod Nano, or Sony’s PlayStation Portable use LDTV video, as higher-resolution files would be excessive to the needs of their small screens (320×240 and 480×272 pixels respectively).  The current generation of iPod Nanos has LDTV screens, as do the first three generations of iPod Touch and iPhone (480×320).  For the first years of its existence, YouTube offered only one low-definition (LD) resolution, 320×240p, at 30 fps or less.  A standard, consumer-grade videotape can be considered standard-definition television (SDTV) due to its resolution (approximately 360 × 480i/576i).

Image © Libron via Wikipedia and is in the public domain

A comparison of 8K UHDTV, 4K UHDTV, HDTV and SDTV resolution.

SDTV

Read more about SDTV here.

Standard-definition television (SDTV) refers to two different resolutions: 576i, with 576 interlaced lines of resolution, derived from the European-developed Phase Alternating Line (PAL) and Séquentiel couleur à mémoire (French for sequential colour with memory) (SECAM) systems, and 480i, based on the American National Television System Committee (NTSC) system.  SDTV is a television (TV) system that uses a resolution that is not considered to be either high-definition television (HDTV) (720p, 1080i, 1080p, 1440p, 4K ultra-high-definition television (UHDTV) and 8K UHD) or enhanced-definition television (EDTV, 480p).  In North America, digital SDTV is broadcast in the same 4:3 aspect ratio as NTSC signals, with widescreen content being centre cut.  However, in other parts of the world that used the PAL or SECAM colour systems, SDTV is now usually shown with a 16:9 aspect ratio, with the transition occurring between the mid-1990’s and mid-2000’s.  Older programmes with a 4:3 aspect ratio are shown in the United States (U.S.) as 4:3, with non-Advanced Television Systems Committee (ATSC) countries preferring to reduce the horizontal resolution by anamorphically scaling a pillarboxed image.

HDTV

Read more about HDTV here

High-definition television (HDTV) provides a resolution that is substantially higher than that of standard-definition television (SDTV).

HDTV may be transmitted in various formats:

1080p: 1920×1080p: 2,073,600 pixels (~2.07 megapixels) per frame.

1080i: 1920×1080i: 1,036,800 pixels (~1.04 MP) per field or 2,073,600 pixels (~2.07 MP) per frame.

A non-standard CEA resolution exists in some countries such as 1440×1080i: 777,600 pixels (~0.78 MP) per field or 1,555,200 pixels (~1.56 MP) per frame.

720p: 1280×720p: 921,600 pixels (~0.92 MP) per frame.

UHDTV

Read more about UHDTV here.

Ultra-high-definition television (UHDTV), also known as Super Hi-Vision,  UltraHD or UHD  includes 4K UHD (2160p) and 8K ultra-high definition (UHD) (4320p), which are two digital video formats proposed by NHK Science & Technology Research Laboratories and defined and approved by the International Telecommunication Union (ITU). The Consumer Electronics Association (CTA) announced on the 17th of October, 2012, that UHD, or Ultra HD, would be used for displays that have an aspect ratio of at least 16:9 and at least one digital input capable of carrying and presenting natural video at a minimum resolution of 3840×2160 pixels.
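For a sense of scale across the formats listed in these resolution sections, here is a small, purely illustrative Python sketch that works out pixels per frame.  The 3840×2160 figure is the CEA minimum quoted above; the 7680×4320 figure for 8K is the full resolution usually implied by the 4320p label and is stated here as an assumption rather than taken from the text.

# Per-frame pixel counts for the resolutions named in the HDTV and UHDTV
# sections.  Interlaced formats ("i") paint half the lines per field, so a
# field carries half the pixels of a full frame.

formats = [
    ("720p",     1280,  720, False),
    ("1080i",    1920, 1080, True),
    ("1080p",    1920, 1080, False),
    ("4K UHDTV", 3840, 2160, False),
    ("8K UHDTV", 7680, 4320, False),   # assumed from the 4320p label
]

for name, width, height, interlaced in formats:
    pixels = width * height
    note = f" ({pixels // 2:,} per field)" if interlaced else ""
    print(f"{name:>9}: {pixels:>10,} pixels per frame (~{pixels / 1e6:.2f} MP){note}")

The figures this prints match the megapixel values in the HDTV list above, and they show why 4K needs roughly four times, and 8K roughly sixteen times, the data of 1080p.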

Content

Television Programming

Read more about Television Programming here, here and here.

Getting television (TV) programming shown to the public can happen in many different ways.  After production, the next step is to market and deliver the product to whichever markets are open to using it.  This typically happens on two levels:

Original run or First run (a producer creates a programme of one or multiple episodes and shows it on a station or network which has either paid for the production itself or to which a license has been granted by the TV producers to do the same).

Broadcast syndication  (this is the terminology rather broadly used to describe secondary programming usages i.e. beyond its original run.  It includes secondary runs in the country of the first issue, but also international usage which may not be managed by the originating producer.  In many cases, other companies, TV stations, or individuals are engaged to do the syndication work, in other words, to sell the product into the markets they are allowed to sell into by contract from the copyright holders, in most cases the producers).

First-run programming is increasing on subscription services outside of the United States (U.S.), but few domestically produced programs are syndicated on domestic free-to-air (FTA) elsewhere.  This practice is increasing, however, generally on digital-only FTA channels or with subscriber-only, first-run material appearing on FTA.  Unlike the U.S., repeat FTA screenings of an FTA network program usually only occur on that network.  Also, affiliates rarely buy or produce non-network programming that is not focused on local programming.

Television Genres

Television (TV)  genres include a broad range of programming types that entertain, inform, and educate viewers.  The most expensive entertainment genres to produce are usually dramas and dramatic miniseries.  However, other genres, such as historical Western genres, may also have high production costs.

Pop culture entertainment genres include action-oriented shows such as police, crime, detective dramas, horror, or thriller shows.  As well, there are also other variants of the drama genre, such as medical dramas and daytime soap operas.  Sci-fi series can fall into either the drama or action category, depending on whether they emphasise philosophical questions or high adventure.  Comedy is a popular genre which includes situation comedy (sitcom) and animated series for the adult demographic such as Comedy Central’s South Park.

The least expensive forms of entertainment programming genres are game shows, talk shows, variety shows, and reality TV.  Game shows feature contestants answering questions and solving puzzles to win prizes.  Talk shows contain interviews with film, TV, music and sports celebrities and public figures.  Variety shows feature a range of musical performers and other entertainers, such as comedians and magicians, introduced by a host or Master of Ceremonies.  There is some crossover between some talk shows and variety shows because leading talk shows often feature performances by bands, singers, comedians, and other performers in between the interview segments.  Reality TV series show regular people (i.e., not actors) facing unusual challenges or experiences ranging from arrest by police officers to significant weight loss.  A derived version of reality shows depicts celebrities doing mundane activities such as going about their everyday life or doing regular jobs.

Fictional TV programmes that some telly scholars and broadcasting advocacy groups argue are quality TV programmes include series such as The Sopranos.  Kristin Thompson argues that some of these television series exhibit traits also found in art films, such as psychological realism, narrative complexity, and ambiguous plot lines.  Nonfiction TV programmes that some telly scholars and broadcasting advocacy groups argue are quality television programmes, include a range of serious, noncommercial, programming aimed at a niche audience, such as documentaries and public affairs shows. 

Television Funding

Around the world, broadcast television (TV) is financed by government, advertising, licensing (a form of tax), subscription, or any combination of these.  To protect revenues, subscription TV channels are usually encrypted to ensure that only subscribers receive the decryption codes to see the signal.  Unencrypted channels are known as free-to-air (FTA).  In 2009, the global TV market represented 1,217.2 million TV households with at least one TV and total revenues of 268.9 billion EUR (declining 1.2% compared to 2008).  North America had the biggest TV revenue market share with 39% followed by Europe (31%), Asia-Pacific (21%), Latin America (8%), and Africa and the Middle East (2%).  Globally, the different TV revenue sources are divided into 45–50% TV advertising revenues, 40–45% subscription fees and 10% public funding.

Television Advertising

Read more about Television advertising here

Television’s broad reach makes it a powerful and attractive medium for advertisers.  Many television (TV) networks and stations sell blocks of broadcast time to advertisers (sponsors) to fund their programming.  A television advertisement (also called a TV commercial, commercial, ad or advert) is a span of TV programming produced and paid for by an organisation, which conveys a message, typically to market a product or service.  Advertising revenue provides a significant portion of the funding for most privately owned TV networks.  The vast majority of TV ads today consist of brief advertising spots, ranging in length from a few seconds to several minutes (as well as programme-length infomercials).  Adverts of this sort have been used to promote a wide variety of goods, services and ideas since the beginning of TV.

The effects of TV advertising upon the viewing public (and the effects of mass media in general) have been the subject of discourse by philosophers including Marshall McLuhan.  The viewership of TV programming, as measured by companies such as Nielsen Media Research, is often used as a metric for TV  advertisement placement, and consequently, for the rates charged to advertisers to air within a given network, television programme, or time of day (called a daypart).  In many countries, including the United States (U.S.), TV campaign advertisements are considered indispensable for a political campaign.  In other countries, such as France, political advertising on the telly is heavily restricted, while some countries, such as Norway, completely ban political adverts.

The first official, paid television ad was broadcast in the U.S. on the 1st of July, 1941, over New York station WNBT (now WNBC) before a baseball game between the Brooklyn Dodgers and Philadelphia Phillies.  The announcement for Bulova watches, for which the company paid anywhere from $4.00 to $9.00 (reports vary), displayed a WNBT test pattern modified to look like a clock with the hands showing the time.  The Bulova logo, with the phrase Bulova Watch Time, was shown in the lower right-hand quadrant of the test pattern while the second hand swept around the dial for one minute.  The first TV ad broadcast in the United Kingdom (U.K.) was on ITV on the 22nd of September, 1955, advertising Gibbs SR toothpaste.  The first TV ad broadcast in Asia was on Nippon Television in Tokyo on the 28th of August, 1953, advertising Seikosha (now Seiko), which also displayed a clock with the current time.

Image via Swtpc6800 on Wikipedia and is in the public domain

Radio News cover, September, 1928.

Television was still in its experimental phase in 1928, but the medium’s potential to sell goods was already predicted.  It was seen as the ideal television of the future but these early experimental televisions could not maintain synchronisation with the camera.  The viewer had to constantly make adjustments as seen by the sync control in the man’s hand.  

United Kingdom

The television (TV) regulator oversees TV advertising in the United Kingdom (U.K.).  Its restrictions have applied since the early days of commercially funded TV.  Despite this, an early TV mogul, Roy Thomson, likened the broadcasting licence to being a licence to print money.  Restrictions mean that the big three national commercial TV channels ITV, Channel 4, and Channel 5 can show an average of only seven minutes of advertising per hour (eight minutes in the peak period).  Other broadcasters must average no more than nine minutes (twelve in the peak).  This means that many imported TV shows from the United States (U.S.) have unnatural pauses where a British company does not use the narrative breaks intended for more frequent U.S. advertising.  Advertisements must not be inserted in the course of certain specific proscribed types of programmes which last less than half an hour in scheduled duration.  This list includes any news or current affairs programmes, documentaries, and programmes for children.  Additionally, ads may not be carried in a programme designed and broadcast for reception in schools, in any religious broadcasting service or other devotional programme, or during a formal Royal ceremony or occasion.  There also must be clear demarcations in time between the programmes and the adverts.  The British Broadcasting Corporation (BBC), being strictly non-commercial, is not allowed to show advertisements on TV in the U.K., although it has many advertising-funded channels abroad.  The majority of its budget comes from TV licence fees and broadcast syndication, the sale of content to other broadcasters.

United States

Since its inception in the United States (U.S.) in 1941, television (TV) commercials have become one of the most effective, persuasive, and popular methods of selling products of many sorts, especially consumer goods.  During the 1940’s and into the 1950’s, programmes were hosted by single advertisers.  This, in turn, gave great creative control to the advertisers over the content of the show.  Perhaps due to the quiz show scandals in the 1950’s, networks shifted to the magazine concept, introducing advertising breaks with other advertisers.

U.S. advertising rates are determined primarily by Nielsen ratings.  The time of the day and popularity of the channel determine how much a TV commercial can cost.  For example, it can cost approximately $750,000 for a 30-second block of commercial time during the highly popular singing competition American Idol, while the same amount of time for the Super Bowl can cost several million dollars. Conversely, lesser-viewed time slots, such as early mornings and weekday afternoons, are often sold in bulk to producers of infomercials at far lower rates.  In recent years, the paid programme or infomercial has become common, usually in lengths of 30 minutes or one hour.  Some drug companies and other businesses have even created news items for broadcast, known in the industry as video news releases, paying programme directors to use them.

Some TV programmes also deliberately place products into their shows as advertisements, a practice that started in feature films and is known as product placement.  For example, a character could be drinking a certain kind of pop, going to a particular chain restaurant, or driving a certain make of car.  This is sometimes very subtle, with shows having vehicles provided by manufacturers for low cost in exchange for product placement.  Sometimes, a specific brand or trade mark, or music from a certain artist or group, is used.  This excludes guest appearances by artists who perform on the show.

Ireland

Broadcast advertising is regulated by the Broadcasting Authority of Ireland.

Subscription 

Some television (TV) channels are partly funded from subscriptions; therefore, the signals are encrypted during the broadcast to ensure that only the paying subscribers have access to the decryption codes to watch pay television or speciality channels.  Most subscription services are also funded by advertising.

Taxation Or License

Television (TV) services in some countries may be funded by a TV licence or a form of taxation, which means that advertising plays a lesser role or no role at all.  For example, some channels may carry no advertising at all and some very little, including:

Australia (ABC Television).

Belgium (VRT for Flanders and RTBF for Wallonia).

Denmark (DR).

Ireland (RTE).

Japan (NHK).

Norway (NRK).

Sweden (SVT).

Switzerland (SRG SSR).

Republic of China (Taiwan) (PTS).

United Kingdom (BBC).

United States (PBS).

Broadcast Programming

Read more about Broadcast Programming here and here.

Broadcast programming, or television (TV) listings in the United Kingdom (U.K.), is the practice of organising TV programmes in a schedule, with broadcast automation used to regularly change the scheduling of TV programmes to build an audience for a new show, retain that audience, or compete with other broadcasters’ programmes.

See Also

Blog Posts

Notes And Links

Article source: Wikipedia and is subject to change.

Max Rahubovskiy on Pexels –  The image shown at the top of this page is the copyright of Max Rahubovskiy.  You can find more great work from the photographer Max by clicking the link above and you can get lots more free stock photos at Pexels.

The image above of Flat-screen televisions in 2008 is the copyright of Wikipedia user Wags05.  It is in the Public Domain. 

The image above of the Nipkow Disk is the copyright of Wikipedia user Hzeller.   It comes with a Creative Commons licence (CC BY-SA 3.0).  

The image above of John Ferdinand Braun is copyright unknown and is in the Public Domain.

The image above of Vladimir Zworykin is copyright unknown and is in the Public Domain.

The image above of Manfred von Ardenne in 1933 is copyright unknown.  It comes with a Creative Commons licence (CC BY-SA 3.0).

The image above of A Radio Corporation Of America 1939 Advertisement is copyright unknown and is in the Public Domain.

The image above of A 40″ Samsung Full HD LED TV is the copyright of Wikipedia user Kskhh.   It comes with a Creative Commons licence (CC BY-SA 4.0).  

The image above of SMPTE colour bars is the copyright of Wikipedia user Denelson83.  It is in the Public Domain. You can see more of his/her great work here.

The image above of an LG Smart TV is the copyright of Wikipedia user LG.  It comes with a Creative Commons licence (CC BY-SA 2.0).  You can see more of their great work here.

The image above of A modern high-gain UHF Yagi television antenna is the copyright of Wikipedia user Tennen-Gas.   It comes with a Creative Commons licence (CC BY-SA 3.0).  

The image above of Coaxial cable is the copyright of Wikipedia user Peter Trieb.  It is in the Public Domain. 

The image above of DBS satellite dishes is the copyright of Wikipedia user Brian Katt.  It comes with a Creative Commons licence (CC BY-SA 3.0).  You can see more of his great work here.

The image above of an RCA Model 630-TS television is the copyright of Wikipedia user Fletcher6.  It comes with a Creative Commons licence (CC BY-SA 3.0).  You can see more of his/her work here.

The image above of a 14″ cathode-ray tube is the copyright of Wikipedia user Blue tooth7.   It comes with a Creative Commons licence (CC BY-SA 3.0)

The image above of a Christie Mirage 5000 DLP projector is the copyright of Wikipedia user Dave Pape.  It is in the Public Domain. You can see more of his great work here.

The image above of an LG 3D OLED TV is the copyright of Wikipedia user LG.  It comes with a Creative Commons licence (CC BY-SA 2.0).  You can see more of their great work here.

The image above of a comparison of 8K UHDTV, 4K UHDTV, HDTV and SDTV resolution is the copyright of Wikipedia user Libron.  It is in the Public Domain.   

The image above of the Radio News cover, September, 1928 is provided by Wikipedia user Swtpc6800 and is in the Public Domain.

Horror

Image © Alexas_Fotos via Pixabay

What is there not to like about horror? It is an escapism from the real world and so damn cool.  I love so much about it.  This page concentrates on the Horror genre and anything I post about that can be seen in Blog Posts below.

I have been a fan of Horror, particularly Horror films, since I was little.  I have loved Universal classic monsters, for it is they that started my love of Horror off, even if they scared the hell out of me at first and I hid under my Mom’s arm or behind the settee watching them, ha ha.  That changed the older I got.

If you mention anything to do with horror then it is inevitable Halloween is mentioned. 

Growing up in England from a child to a teenager in the 1960’s, 1970’s and 1980’s, Halloween was an American thing you saw on the telly.  There was no dressing up and trick-or-treating, not in my family home anyway.  Even when my kids were younger I never really bothered much about Halloween.  It was just all too American for me and I just liked the English traditions I was brought up with.  They had fun wearing masks, bobbing for apples etc. but we never went out dressed up knocking on people’s doors.  In fact, I don’t recall ever seeing anyone else do it either.

Nowadays all of the above is a common sight.  I am no killjoy and I don’t knock anyone who really enjoys it.  I admit it’s a fun thing for kids to do and a good excuse for a party for the adults which I have enjoyed going to in the past few years.  When you have suffered from depression and anxiety for as long as I have, just to be included can be a lifesaver.

The main thing I like about Halloween is dressing up and the Horror theme to it.  I never celebrated Halloween in the past because, since I was a kid, I have loved horror.  Every day is Halloween for me, ha ha.

About Horror 

Horror is a genre of fiction that is intended to disturb, frighten or scare. Horror is often divided into the sub-genres of psychological horror and supernatural horror, which are in the realm of speculative fiction.  Literary historian J. A. Cuddon, in 1984, defined the horror story as “a piece of fiction in prose of variable length… which shocks, or even frightens the reader, or perhaps induces a feeling of repulsion or loathing”.  Horror intends to create an eerie and frightening atmosphere for the reader.  Often the central menace of a work of horror fiction can be interpreted as a metaphor for larger fears of a society.

Prevalent elements include ghosts, demons, vampires, monsters, zombies, werewolves, the Devil, serial killers, extraterrestrial life, killer toys, psychopaths, gore, torture, evil clowns, cults, cannibalism, vicious animals, the apocalypse, evil witches, dystopia and man-made or natural disasters. 

Image by Gustave Dore via Wikipedia and is in the public domain

The Raven by Gustave Dore.

This is an illustration from the 1884 edition of Edgar Allan Poe’s The Raven, depicting the line “Doubting, dreaming dreams no mortal ever dared to dream before.”

The History Of Horror 

Before 1000

The horror genre has ancient origins, with roots in folklore and religious traditions focusing on death, the afterlife, evil, the demonic and the principle of the thing embodied in the person.  These manifested in stories of beings such as demons, witches, vampires, werewolves and ghosts.  European horror fiction became established through the works of the Ancient Greeks and Ancient Romans.  Mary Shelley’s well-known 1818 novel about Frankenstein was greatly influenced by the story of Hippolytus, whom Asclepius revives from death.  Euripides wrote plays based on the story, Hippolytos Kalyptomenos and Hippolytus.  In Plutarch’s The Lives of the Noble Grecians and Romans in the account of Cimon, the author describes the spirit of a murderer, Damon, who himself was murdered in a bathhouse in Chaeronea.

Pliny the Younger (61 to circa 113) tells the tale of Athenodorus Cananites, who bought a haunted house in Athens.  Athenodorus was cautious since the house seemed inexpensive.  While writing a book on philosophy, he was visited by a ghostly figure bound in chains.  The figure disappeared in the courtyard and the following day, the magistrates dug in the courtyard and found an unmarked grave.

Elements of the horror genre also occur in Biblical texts, notably in the Book of Revelation.

After 1000

The Witch of Berkeley by William of Malmesbury has been viewed as an early horror story.  Werewolf stories were popular in medieval French literature. One of Marie de France’s twelve lais is a werewolf story titled Bisclavret.

The Countess Yolande commissioned a werewolf story titled Guillaume de Palerme.  Anonymous writers penned two werewolf stories, Biclarel and Melion.

Much horror fiction derives from the cruellest personages of the 15th century.  Dracula can be traced to the Prince of Wallachia Vlad III, whose alleged war crimes were published in German pamphlets.  A 1499 pamphlet was published by Markus Ayrer, which is most notable for its woodcut imagery.  The alleged serial killer sprees of Gilles de Rais have been seen as the inspiration for Bluebeard.  The motif of the vampiress is most notably derived from the real-life noblewoman and murderer, Elizabeth Bathory, and helped usher in the emergence of horror fiction in the 18th century, such as through Laszlo Turoczi’s 1729 book Tragica Historia.

Image by unknown via Wikipedia and is in the public domain

Vlad The Impaler.

This is a portrait of Vlad Tzepesh (Vlad III).  He was the inspiration for Count Dracula.  Tzepesh ruled Wallachia three times: briefly in 1448, from 1456 to 1462, and again in 1476.

18th Century

The 18th century saw the gradual development of Romanticism and the Gothic horror genre.  It drew on the written and material heritage of the Late Middle Ages, finding its form with Horace Walpole’s seminal and controversial 1764 novel, The Castle of Otranto.  In fact, the first edition was published disguised as an actual medieval romance from Italy, discovered and republished by a fictitious translator.  Once revealed as modern, many found it anachronistic, reactionary, or simply in poor taste but it proved immediately popular.  Otranto inspired Vathek (1786) by William Beckford, A Sicilian Romance (1790), The Mysteries of Udolpho (1794) and The Italian (1796) by Ann Radcliffe, and The Monk (1797) by Matthew Lewis.  A significant amount of horror fiction of this era was written by women and marketed towards a female audience, a typical scenario of the novels being a resourceful female menaced in a gloomy castle.

Image by Joshua Reynolds via Wikipedia and is in the public domain

Horace Walpole by Joshua Reynolds.

Image by Henry Justice Ford via Wikipedia and is in the public domain

Athenodorus by Henry Justice Ford.

Here Athenodorus confronts the Spectre.  It is from The Strange Story Book by Leonora Blanche Lang and Andrew Lang.

19th Century

The Gothic tradition blossomed into the genre that modern readers today call horror literature in the 19th century.  Influential works and characters that continue resonating in fiction and film today saw their genesis in the Brothers Grimm’s Hansel and Gretel (1812), Mary Shelley’s Frankenstein (1818), John Polidori’s The Vampyre (1819), Charles Maturin’s Melmoth the Wanderer (1820), Washington Irving’s The Legend of Sleepy Hollow (1820), Jane C. Loudon’s The Mummy!: Or a Tale of the Twenty-Second Century (1827), Victor Hugo’s The Hunchback of Notre Dame (1831), Thomas Peckett Prest’s Varney the Vampire (1847), the works of Edgar Allan Poe, the works of Sheridan Le Fanu, Robert Louis Stevenson’s Strange Case of Dr Jekyll and Mr Hyde (1886), Oscar Wilde’s The Picture of Dorian Gray (1890), H. G. Wells’ The Invisible Man (1897), and Bram Stoker’s Dracula (1897).  Each of these works created an enduring icon of horror seen in later re-imaginings on the page, stage and screen.

Image by Richard Rothwell via Wikipedia and is in the public domain

Mary Shelley By Richard Rothwell.

20th Century

A proliferation of cheap periodicals around the turn of the century led to a boom in horror writing.  For example, Gaston Leroux serialised his Le Fantome de l’Opera (The Phantom Of The Opera) before it became a novel in 1910.   One writer who specialised in horror fiction for mainstream pulps, such as All-Story Magazine, was Tod Robbins, whose fiction deals with themes of madness and cruelty.  In Russia, the writer Alexander Belyaev popularised these themes in his story Professor Dowell’s Head (1925), in which a mad doctor performs experimental head transplants and reanimations on bodies stolen from the morgue, and which was first published as a magazine serial before being turned into a novel.  Later, specialist publications emerged to give horror writers an outlet, prominent among them were Weird Tales and Unknown Worlds.

Influential horror writers of the early 20th century made inroads into these mediums.  Particularly, the venerated horror author H. P. Lovecraft, and his enduring Cthulhu Mythos transformed and popularised the genre of cosmic horror, and M. R. James is credited with redefining the ghost story in that era.

The serial murderer became a recurring theme.  Yellow journalism and sensationalism of various murderers, such as Jack the Ripper, and, to a lesser extent, Carl Panzram, Fritz Haarmann, and Albert Fish, all perpetuated this phenomenon.  The trend continued in the postwar era, partly renewed after the murders committed by Ed Gein.  In 1959, Robert Bloch, inspired by the murders, wrote Psycho.  The crimes committed in 1969 by the Manson Family influenced the slasher theme in horror fiction of the 1970’s.  In 1981, Thomas Harris wrote Red Dragon, introducing Dr. Hannibal Lecter.  In 1988, the sequel to that novel, The Silence of the Lambs, was published.

Early cinema was inspired by many aspects of horror literature and started a strong tradition of horror films and subgenres that continues to this day.  Up until the graphic depictions of violence and gore on the screen commonly associated with 1960’s and 1970’s slasher films and splatter films, comic books such as those published by EC Comics (most notably Tales From The Crypt) in the 1950’s satisfied readers’ quests for horror imagery that the silver screen could not provide.  This imagery made these comics controversial, and as a consequence, they were frequently censored.

The modern zombie tale dealing with the motif of the living dead harks back to works including H. P. Lovecraft’s stories Cool Air (1925), In The Vault (1926), and The Outsider (1926), and Dennis Wheatley’s Strange Conflict (1941).  Richard Matheson’s novel I Am Legend (1954) influenced an entire genre of apocalyptic zombie fiction emblematized by the films of George A. Romero.

In the late 1960’s and early 1970’s, the enormous commercial success of three books – Rosemary’s Baby (1967) by Ira Levin, The Exorcist by William Peter Blatty, and The Other by Thomas Tryon – encouraged publishers to begin releasing numerous other horror novels, thus creating a horror boom.

One of the best-known late-20th-century horror writers is Stephen King, known for Carrie, The Shining, It, Misery and several dozen other novels and about 200 short stories.  Beginning in the 1970’s, King’s stories have attracted a large audience, for which the U.S. National Book Foundation honoured him with its Medal for Distinguished Contribution to American Letters in 2003.  Other popular horror authors of the period included Anne Rice, Brian Lumley, Graham Masterton, James Herbert, Dean Koontz, Richard Laymon, Clive Barker, Ramsey Campbell, and Peter Straub.

Image © Pinguino Kolb via Wikipedia

Stephen King.

This photo of King was taken at the 2007 New York Comicon in America.

21st Century

Best-selling book series of contemporary times exist in genres related to horror fiction, such as the werewolf urban fantasy Kitty Norville books by Carrie Vaughn (2005 onward).  Horror elements continue to expand outside the genre.  The alternate history of more traditional historical horror in Dan Simmons’s 2007 novel The Terror sits on bookstore shelves next to genre mash-ups such as Pride and Prejudice and Zombies (2009), and historical fantasy and horror comics such as Hellblazer (1993 onward) and Mike Mignola’s Hellboy (1993 onward).  Horror also serves as one of the central genres in more complex modern works such as Mark Z. Danielewski’s House of Leaves (2000), a finalist for the National Book Award.  There are many horror novels for children and teens, such as R. L. Stine’s Goosebumps series or The Monstrumologist by Rick Yancey.  Additionally, many movies for young audiences, particularly animated ones, use horror aesthetics and conventions, for example, ParaNorman.  These are what can be collectively referred to as children’s horror.  Although it is not known for sure why children enjoy these movies (as it seems counter-intuitive), it is theorised that it is, in part, the grotesque monsters that fascinate kids.  Tangential to this, the internalised impact of horror television programmes and films on children is rather under-researched, especially when compared to the research done on the similar subject of the impact of violence in TV and film on the young mind.  What little research there is tends to be inconclusive on the impact that viewing such media has.

Related Genres

Horror Characteristics

One defining trait of the horror genre is that it provokes an emotional, psychological, or physical response within readers that causes them to react with fear.  One of H. P. Lovecraft’s most famous quotes about the genre is “The oldest and strongest emotion of mankind is fear, and the oldest and strongest kind of fear is fear of the unknown.”  This is the first sentence from his seminal essay, Supernatural Horror in Literature.  Science fiction historian Darrell Schweitzer has stated, “In the simplest sense, a horror story is one that scares us” and “the true horror story requires a sense of evil, not necessarily in a theological sense, but the menaces must be truly menacing, life-destroying, and antithetical to happiness.”

In her essay Elements of Aversion, Elizabeth Barrette articulates the need by some for horror tales in a modern world.  She says, “The old fight or flight reaction of our evolutionary heritage once played a major role in the life of every human.  Our ancestors lived and died by it.  Then someone invented the fascinating game of civilization, and things began to calm down.  Development pushed wilderness back from settled lands.  War, crime, and other forms of social violence came with civilization and humans started preying on each other, but by and large daily life calmed down.  We began to feel restless, to feel something missing, the excitement of living on the edge, the tension between hunter and hunted.  So we told each other stories through the long, dark nights.  When the fires burned low, we did our best to scare the daylights out of each other.  The rush of adrenaline feels good.  Our hearts pound, our breath quickens, and we can imagine ourselves on the edge.  Yet we also appreciate the insightful aspects of horror.  Sometimes a story intends to shock and disgust, but the best horror intends to rattle our cages and shake us out of our complacency.  It makes us think, forces us to confront ideas we might rather ignore, and challenges preconceptions of all kinds.  Horror reminds us that the world is not always as safe as it seems, which exercises our mental muscles and reminds us to keep a little healthy caution close at hand.”

In a sense similar to the reason a person seeks out the controlled thrill of a roller coaster, readers in the modern era seek out feelings of horror and terror to feel a sense of excitement.  However, Barrette adds that horror fiction is one of the few mediums where readers seek out a form of art that forces themselves to confront ideas and images they “might rather ignore to challenge preconceptions of all kinds.”

One can see the confrontation of ideas that readers and characters would rather ignore throughout literature in famous moments such as Hamlet’s musings about the skull of Yorick, its implications of the mortality of humanity, and the gruesome end that bodies inevitably come to.  In horror fiction, the confrontation with the gruesome is often a metaphor for the problems facing the current generation of the author.

There are many theories as to why people enjoy being scared. For example, people who like horror films are more likely to score highly for openness to experience, a personality trait linked to intellect and imagination.

It is now a commonly accepted view that the horror elements of Dracula’s portrayal of vampirism are metaphors for sexuality in a repressed Victorian era.  But this is merely one of many interpretations of the metaphor of Dracula.  Jack Halberstam postulates many of these in his essay Technologies of Monstrosity: Bram Stoker’s Dracula.  He writes, “[The] image of dusty and unused gold, coins from many nations and old unworn jewels, immediately connects Dracula to the old money of a corrupt class, to a kind of piracy of nations and to the worst excesses of the aristocracy.”

Halberstam articulates a view of Dracula as manifesting the growing perception of the aristocracy as an evil and outdated notion to be defeated.  The depiction of a multinational band of protagonists using the latest technologies (such as the telegraph) to quickly share, collate, and act upon new information is what leads to the destruction of the vampire.  This is one of many interpretations of the metaphor of only one central figure of the canon of horror fiction, as over a dozen possible metaphors are referenced in the analysis, from the religious to the antisemitic.

Noel Carroll’s Philosophy of Horror postulates that a modern piece of horror fiction’s monster, villain, or more inclusive menace must exhibit the following two traits: it must be threatening (whether physically, psychologically, socially, morally, spiritually, or some combination of these) and it must be impure (violating the generally accepted schemes of cultural categorisation).

Image by John Tenniel via Wikipedia and is in the public domain

The Irish Frankenstein by John Tenniel.

This illustration is from an 1882 issue of Punch and is anti-Irish propaganda.  Tenniel conceives the Irish Fenian movement as akin to Frankenstein’s monster, in the wake of the Phoenix Park killings.  Menacing villains and monsters in horror literature can often be seen as metaphors for the fears incarnate of a society.

Scholarship And Criticism

In addition to those essays and articles shown above, scholarship on horror fiction is almost as old as horror fiction itself.  In 1826, the gothic novelist Ann Radcliffe published an essay distinguishing two elements of horror fiction, terror and horror.  Whereas terror is a feeling of dread that takes place before an event happens, horror is a feeling of revulsion or disgust after an event has happened.  Radcliffe describes terror as that which expands the soul and awakens the faculties to a high degree of life, whereas horror is described as that which freezes and nearly annihilates them.

Modern scholarship on horror fiction draws upon a range of sources.  In their historical studies of the gothic novel, both Devendra Varma and S. L. Varnado make reference to the theologian Rudolf Otto, whose concept of the numinous was originally used to describe religious experience.

Awards And Associations

Achievements in horror fiction are recognised by numerous awards.  The Horror Writers Association presents the Bram Stoker Awards for Superior Achievement, named in honour of Bram Stoker, author of the seminal horror novel Dracula.  The Australian Horror Writers Association presents the annual Australian Shadows Awards.  The International Horror Guild Award was presented annually to works of horror and dark fantasy from 1995 to 2008.  The Shirley Jackson Awards are literary awards for outstanding achievement in the literature of psychological suspense, horror, and dark fantastic works.  Other important awards for horror literature are included as subcategories within general awards for fantasy and science fiction in such awards as the Aurealis Award.

Alternative Terms

Some writers of fiction normally classified as horror tend to dislike the term, considering it too lurid.  They instead use the terms dark fantasy or Gothic fantasy for supernatural horror, or psychological thriller for non-supernatural horror.

Horror Films Since The 1890’s

For more Horror film lists click here.

Read more about Horror and notes etc. regarding the above post here.

The above articles and the rest of the images on this page were sourced from Wikipedia and are subject to change.

Blog Posts

Links

Alexas_Fotos on Pixabay – The image shown at the top of this page is the copyright of Alexas_Fotos.  You can find more great work from the photographer Alexa and lots more free stock photos at Pixabay.

The image above of The Raven by Gustave Dore is via Wikipedia and is in the public domain.

The image above of Vlad the Impaler is via Wikipedia and is in the public domain.

The image above of Horace Walpole by Joshua Reynolds is via Wikipedia and is in the public domain. 

The image above of Athenodorus by Henry Justice Ford is via Wikipedia and is in the public domain.

The image above of Mary Shelley by Richard Rothwell is via Wikipedia and is in the public domain.

The image shown above of Stephen King is the copyright of Wikipedia user Pinguino Kolb.  It comes with a Creative Commons licence (CC BY-SA 2.0)

The image above of The Irish Frankenstein by John Tenniel is via Wikipedia and is in the public domain.

Universal Pictures – U.K. official website.

Universal Pictures on YouTube.

Universal Pictures on Facebook.

Universal Pictures on Twitter.

Universal Studios – Official website.

Universal Studios on YouTube.

Universal Studios on Facebook.

Universal Studios on Twitter.