Birmingham City: Keep Right On To The End Of The Road

B.C.F.C. Badge
Image © of B.C.F.C.

There is only one team in Birmingham worth supporting with true passion and Birmingham City is it.  I have been supporting them since 1978 when Jim Smith was the manager.  He is my favourite manager to date.   I am a blue nose ’til I die.

You can read lots more about Blues by clicking here

The following article is all about Blues’ current situation in the Championship as of March the 13th, 2024.

It also appears (slightly edited) on Blues Focus.   It is my first article for them.  Click here to view.

How Are We Doing?  

You can see all our results from this season so far here.

As of the time of writing on March the 13th, 2024, Blues are uncomfortably lying in 21st place, just one point from the bottom three.  We have drawn only one game in the last five.  Our last game was our game in hand against Middlesbrough, which was a must-win, and we didn’t win it.

Is It Time To Worry?  

Worry! Yes and No! Be very anxious, most definitely!

Whether I worry depends on whether we can get a much-needed three points against Watford in our next game, and on whether we drop into the bottom three and stay there by the middle of April.

We have eight games to play after Watford and, I am not going to lie, it is now squeaky-bum time of the season, but we have escaped relegation before.  Will our luck finally run out and send us down to League One?  We will know for sure come our last game on the 4th of May, but until it is mathematically certain that we are down, the season isn’t over, despite a majority of so-called fans on social media throwing in the towel already.

I am trying hard to keep the faith.  I’ll be honest, it’s been harder than other times we have been in this situation but I always try to take the positives out of every game and I always have hope, not just in football but in life itself.  It’s what keeps me going.  I will always back the team and most of all the badge, which will always come first for me regardless of who is playing, what manager is in charge, who owns the club or what league we are in.

Make no bones about it, April is going to be tough for us but it’s not like we haven’t been in a relegation fight before.  Blues do what we do best and that is fight until the end.  Now is not the time to point fingers and blame so and so but a time for players and fans to stay strong, TOGETHER.

Will Tony Mowbray Be Back In Charge For The Q.P.R. Game?  

It was rumoured today on social media that Blues boss Tony Mowbray, who has been away since the 19th of February, would be back for the home game against Queen’s Park Rangers on March the 29th.

I take rumours online with a pinch of salt, but I do hope it is true, as I am sure all Blues fans do.  However, I only want that if he is truly well enough for the task of the tough days ahead until the end of the season.  We need him now, more than ever.

Do We Get A Temporary Manager In If Mowbray Doesn’t Return Soon?

Some “fans” have suggested on social media that Chairman Of The Board Tom Wagner should bring in another temporary manager to take over from current temporary manager Mark Venus because he isn’t cutting the mustard.  These comments are ridiculous.  It would not only be disrespectful to Venus, but it would be especially disrespectful to Mowbray and undermine his authority.  If it were to happen, Mowbray would resign for sure, as would Venus and the rest of the staff.  The new temporary manager may stay, or he may not, and then we would have to find yet another new manager and staff and hope they would be successful.  It would be total madness and would have the club moving backwards, not forwards, and I can’t see Wagner wanting that to happen.

What If?

What if the worst comes to the worst and Blues do get relegated to League One?  Then what?  We keep supporting them regardless, that’s what; it’s what a true, loyal Blues fan does.  Of course, it would be heartbreaking, but it would not be the end of the world.

We dropped down to the old Third Division in 1989 for the first time in our history, and it wasn’t until 2002 that we were in the Premiership (established in 1992).  Under the shrewd ownership of Knighthead Co-Founder and Co-C.E.O. Tom Wagner and with the experience of manager Tony Mowbray, it won’t take as long to get back to the glory days, I am sure of that.

Will Wagner And Knighthead Capital Depart If We Get Relegated?

Some “fans” on social media believe this will happen, but I don’t.  Wagner doesn’t look like someone who is a quitter.  He and Knighthead have invested too much time and money into Birmingham City, and financially it wouldn’t make sense for them to move on without making a profit on their investment.  Wagner said it was a long-term plan to get Birmingham City back into the Premiership and, as he is a man of integrity, I do not believe he would go back on his word and do that to the Blues fans.

And Finally

Whether we stay up or go down, win, lose or draw, everyone has a right to an opinion and to feel angry, sad or whatever emotion the joys and sorrows of being a Blues fan entail, but being aggressive and abusive to your own supporters, players, manager or owners, online or to their faces, can’t be condoned.  I think it is better not to comment at all, as things get said in the heat of the moment that shouldn’t be.  Even if what you say is meant in good faith, your words easily get misconstrued and twisted on social media and used against you.

I am an empathetic and passionate bloke and have supported Blues since the late 70’s.  Since my teens, I have battled with depression and anxiety, and at the moment I just can’t watch them play.  It does not help my mental health at all.  My anxiety and stress levels have been too high for my comfort lately.

It’s like watching a family member or your pet suffer.  It’s heartbreaking.  However, that doesn’t mean I don’t care what’s going on; I just currently can’t take the stress.  It genuinely makes me feel ill.

I always say that, for me, supporting Blues is like having a girlfriend or wife who pisses me off or makes me sad: I love her and I forgive her, but at times I need my own space, ha ha.  But as the header above says, true Blues fans will always KEEP RIGHT ON TO THE END OF THE ROAD.

Notes And Links

The Birmingham City Club logo shown at the top of this page is the copyright of Birmingham City F.C. and has come from Blues social media pages and website. 

Birmingham City F.C. – Official website.  

Birmingham City on Facebook  – This is their official Facebook page.

Birmingham City on Twitter – This is their official Twitter page.

Birmingham City on YouTube – This is their official YouTube page.

Blues Store Online – Birmingham City’s official club store online.

Blues Focus – Official website.  Bringing you closer to Birmingham City Football Club.

Blues Focus on Facebook – This is their official Facebook page.

Blues Focus on Twitter – This is their official Twitter page.

Blues Focus on Instagram – This is their official Instagram page.

Blues Focus on YouTube – This is their official YouTube channel.

Christmas

Image © of Liliboas via iStock

I have many happy memories over the decades, especially family ones from when I was younger in the 70’s and 80’s, and from when my kids were younger.  Sadly my mental health suffered in my adult years, especially through the 2010’s and into the start of the 2020’s, and it was difficult to enjoy Christmases and love them like I used to, but thankfully I can start to LOVE CHRISTMAS again.

For me, Christmas is about being with family and friends.  It is enjoying good company and eating, drinking and being merry.  It is reminiscing about the happy Christmases of old and remembering people and animals that shared those precious times with us but are no longer here with us.  It is about wonderful Christmas trees and the giving and receiving of presents.  It is about the beautiful colours that come with it.  It is about traditions.  It is about listening to Christmas music and watching Christmas films and programmes.  It is about the spirit of Christmas and the feeling of peace.  It is not just a holiday, it is a state of mind.

Living in the mostly Christian country of England when I was younger (not so much now) and being a former Christian myself, I always celebrated Christmas as marking the birth of Jesus Christ.

The older I got, the more I came to realise, as an atheist, that the Bible contradicts itself and is full of fictional stories.  The date of the birth itself, December the 25th, has never been agreed upon or proved throughout the centuries (and I’m not going to cover all of that below), but to be honest I don’t care about the date, what did or didn’t happen on it, or whether anyone involved was real; that is neither here nor there.

I am someone who tries hard to avoid talking about religion, royalty and politics, but it would be impossible to talk about Christmas without referring to religion in what is written below; however, it is written respectfully.  As I have always said about religion: as long as it doesn’t involve harm or hatred and is peaceful, I will respect your right to believe whatever you like, as long as you respect my right not to believe.  Royalty and politics are briefly mentioned, as it is hard to avoid them when they are part of Christmas history, but mainly I wanted to keep this page interesting and informative about Christmas.

If you are reading this in December then have a very HAPPY CHRISTMAS!

Image © of Crumpled Fire via Wikipedia

A Nativity Scene made with Christmas lights.

About Christmas

Christmas is an annual festival commemorating the birth of Jesus Christ, primarily observed on December the 25th as a religious and cultural celebration among billions of people around the world.  A feast central to the Christian liturgical year, it follows the season of Advent (which begins four Sundays before) or the Nativity Fast, and initiates the season of Christmastide, which historically in the West lasts twelve days and culminates on Twelfth Night.  Christmas Day is a public holiday in many countries, is celebrated religiously by a majority of Christians, as well as culturally by many non-Christians, and forms an integral part of the holiday season organised around it.

The traditional Christmas narrative recounted in the New Testament, known as the Nativity of Jesus, says that Jesus was born in Bethlehem, under messianic prophecies.  When Joseph and Mary arrived in the city, the inn had no room so they were offered a stable where the Christ Child was soon born, with angels proclaiming this news to shepherds who then spread the word.

There are different hypotheses regarding the date of Jesus’ birth and in the early fourth century, the church fixed the date as December the 25th.  This corresponds to the traditional date of the winter solstice on the Roman calendar.  It is exactly nine months after the Annunciation on March the 25th, also the date of the spring equinox.  Most Christians celebrate on December the 25th in the Gregorian calendar, which has been adopted almost universally in the civil calendars used in countries worldwide.  However, some of the Eastern Christian Churches celebrate Christmas on December the 25th of the older Julian calendar, which currently corresponds to January the 7th in the Gregorian calendar.  For Christians, believing that God came into the world in the form of man to atone for the sins of humanity, rather than knowing Jesus’ exact birth date, is considered to be the primary purpose of celebrating Christmas.

The celebratory customs associated in various countries with Christmas have a mix of pre-Christian, Christian, and secular themes and origins.  Popular modern customs of the holiday include gift giving, completing an Advent calendar or Advent wreath, Christmas music and caroling, watching Christmas movies, viewing a Nativity play, an exchange of Christmas cards, church services, a special meal, and the display of various Christmas decorations, including Christmas trees, Christmas lights, nativity scenes, garlands, wreaths, mistletoe, and holly. In addition, several closely related and often interchangeable figures, known as Father Christmas, Santa Claus,  Saint Nicholas, and the Christkind, are associated with bringing gifts to children during Christmas and have their own body of traditions and lore.  Because gift-giving and many other aspects of the Christmas festival involve heightened economic activity, the holiday has become a significant event and a key sales period for retailers and businesses.   Over the past few centuries, Christmas has had a steadily growing economic effect in many regions of the world. 

Etymology

Other Names 

In addition to Christmas, the holiday has had various other English names throughout its history.  The Anglo-Saxons referred to the feast as midwinter, or, more rarely, as Nātiuiteð, which comes from the Latin nātīvitās.  Nativity, meaning birth, is also from the Latin nātīvitās.  In Old English, Gēola (Yule) referred to the period corresponding to December and January, which was eventually equated with Christian Christmas.  Noel (also Nowel or Nowell, as in The First Nowell) entered English in the late 14th century and is from the Old French noël or naël, itself ultimately from the Latin nātālis (diēs) meaning birth (day).

Koleda is the traditional Slavic name for Christmas and the period from Christmas to Epiphany or, more generally, to Slavic Christmas-related rituals, some dating to pre-Christian times.

The History Of Christmas

In the 2nd century, the earliest church records indicate that Christians were remembering and celebrating the birth of Jesus, an observance that sprang up organically from the authentic devotion of ordinary believers although a set date was not agreed on.  Though Christmas did not appear on the lists of festivals given by the early Christian writers Irenaeus and Tertullian, the early Church Fathers John Chrysostom, Augustine of Hippo, and Jerome attested to December the 25th as the date of Christmas toward the end of the fourth century.  A passage in Commentary on the Prophet Daniel (AD 204) by Hippolytus of Rome identifies December the 25th as Jesus’s birth date, but this passage is considered a later interpolation.

In the East, the birth of Jesus was celebrated in connection with the Epiphany on January the 6th.  This holiday was not primarily about Christ’s birth, but rather his baptism.  Christmas was promoted in the East as part of the revival of Orthodox Christianity that followed the death of the pro-Arian Emperor Valens at the Battle of Adrianople in 378.  The feast was introduced in Constantinople in 379, in Antioch by John Chrysostom towards the end of the fourth century, probably in 388, and in Alexandria in the following century.  The Georgian Iadgari demonstrates that Christmas was celebrated in Jerusalem by the sixth century.

Post-Classical History

Christmas played a role in the Arian controversy of the fourth century.   After this controversy ran its course, the prominence of the holiday declined for a few centuries.

In the Early Middle Ages, Christmas Day was overshadowed by Epiphany, which in Western Christianity focused on the visit of the magi.  However, the medieval calendar was dominated by Christmas-related holidays.  The forty days before Christmas became the forty days of St. Martin (which began on November the 11th, the feast of St. Martin of Tours), now known as Advent.  In Italy, former Saturnalian traditions were attached to Advent.  Around the 12th century, these traditions transferred again to the Twelve Days of Christmas (December the 25th to January the 5th).  This is a time that appears in the liturgical calendars as Christmastide or Twelve Holy Days.

In 567, the Council of Tours put in place the season of Christmastide, proclaiming the twelve days from Christmas to Epiphany as a sacred and festive season, and established the duty of Advent fasting in preparation for the feast.  This was done to solve the administrative problem for the Roman Empire as it tried to coordinate the solar Julian calendar with the lunar calendars of its provinces in the east.

The prominence of Christmas Day increased gradually after Charlemagne was crowned Emperor on Christmas Day in 800.  King Edmund the Martyr was anointed on Christmas in 855 and King William I of England was crowned on Christmas Day 1066.

By the High Middle Ages, the holiday had become so prominent that chroniclers routinely noted where various magnates celebrated Christmas.  King Richard II of England hosted a Christmas feast in 1377 at which 28 oxen and 300 sheep were eaten.  The Yule boar was a common feature of medieval Christmas feasts.  Carolling also became popular and was originally performed by a group of dancers who sang.  The group was composed of a lead singer and a ring of dancers that provided the chorus.  Various writers of the time condemned carolling as lewd, indicating that the unruly traditions of Saturnalia and Yule may have continued in this form.  Misrule (drunkenness, promiscuity, gambling) was also an important aspect of the festival.  In England, gifts were exchanged on New Year’s Day, and there was a special Christmas ale.

Christmas during the Middle Ages was a public festival that incorporated ivy, holly, and other evergreens.  Christmas gift-giving during the Middle Ages was usually between people with legal relationships, such as tenants and landlords.  The annual indulgence in eating, dancing, singing, sporting, and card playing escalated in England, and by the 17th century, the Christmas season featured lavish dinners, elaborate masques, and pageants.  In 1607, King James I insisted that a play be acted on Christmas night and that the court indulge in games.  It was during the Reformation in 16th- and 17th-century Europe that many Protestants changed the gift bringer to the Christ Child or Christkindl, and the date of giving gifts changed from December the 6th to Christmas Eve.

Image is by unknown via Wikipedia and is in the public domain

The Nativity by unknown.

This beautiful image comes from a 14th-century Missal.  It is made from parchment and originates from East Anglia.   It is considered a very important manuscript as it is one of the earliest examples of a Missal of an English source. 

Sarum Missals were books produced by the Church during the Middle Ages for celebrating Mass throughout the year.

Image is by Julius Schnorr von Carolsfeld via Wikipedia and is in the public domain

The Coronation of Charlemagne on Christmas of 800 by Julius Schnorr von Carolsfeld.

Modern History

17th And 18th Centuries

Following the Protestant Reformation, many of the new denominations, including the Anglican Church and Lutheran Church, continued to celebrate Christmas.  In 1629, the Anglican poet John Milton penned On the Morning of Christ’s Nativity, a poem that has since been read by many during Christmastide.  Donald Heinz, a professor at California State University, states that Martin Luther inaugurated a period in which Germany would produce a unique culture of Christmas, much copied in North America.  Among the congregations of the Dutch Reformed Church, Christmas was celebrated as one of the principal evangelical feasts.

However, in 17th century England, some groups such as the Puritans strongly condemned the celebration of Christmas, considering it a Catholic invention and the trappings of popery or the rags of the Beast.  In contrast, the established Anglican Church pressed for a more elaborate observance of feasts, penitential seasons, and saints’ days.  The calendar reform became a major point of tension between the Anglican party and the Puritan party.  The Catholic Church also responded, promoting the festival in a more religiously oriented form.  King Charles I of England directed his noblemen and gentry to return to their landed estates in midwinter to keep up their old-style Christmas generosity.  Following the Parliamentarian victory over Charles I during the English Civil War, England’s Puritan rulers banned Christmas in 1647.

Protests followed as pro-Christmas rioting broke out in several cities, and for weeks Canterbury was controlled by the rioters, who decorated doorways with holly and shouted royalist slogans.  Football, among the sports the Puritans banned on a Sunday, was also used as a rebellious force: when Puritans outlawed Christmas in England in December 1647, the crowd brought out footballs as a symbol of festive misrule.  The book The Vindication of Christmas (London, 1652) argued against the Puritans and made note of Old English Christmas traditions: dinner, roast apples on the fire, card playing, dances with plow-boys and maidservants, old Father Christmas and carol singing.  During the ban, semi-clandestine religious services marking Christ’s birth continued to be held, and people sang carols in secret.

Christmas was restored as a legal holiday in England with the Restoration of King Charles II in 1660, when Puritan legislation was declared null and void, and it was again freely celebrated.  Many Calvinist clergymen disapproved of Christmas celebrations.  As such, in Scotland, the Presbyterian Church of Scotland discouraged the observance of Christmas, and though James VI commanded its celebration in 1618, church attendance was scant.  The Parliament of Scotland officially abolished the observance of Christmas in 1640, claiming that the church had been purged of all superstitious observation of days.  Whereas in England, Wales and Ireland Christmas Day is a common law holiday, having been a customary holiday since time immemorial, it was not until 1871 that it was designated a bank holiday in Scotland.  The diary of James Woodforde, from the latter half of the 18th century, details the observance of Christmas and celebrations associated with the season over several years.

As in England, Puritans in Colonial America staunchly opposed the observation of Christmas.  The Pilgrims of New England pointedly spent their first 25th of December in the New World working normally.  Puritans such as Cotton Mather condemned Christmas both because scripture did not mention its observance and because Christmas celebrations of the day often involved boisterous behaviour.  Many non-Puritans in New England deplored the loss of the holidays enjoyed by the labouring classes in England.  Christmas observance was outlawed in Boston in 1659.  The ban on Christmas observance was revoked in 1681 by English governor Edmund Andros, but it was not until the mid-19th century that celebrating Christmas became fashionable in the Boston region.

At the same time, Christian residents of Virginia and New York observed the holiday freely.  Pennsylvania Dutch settlers, predominantly Moravian settlers of Bethlehem, Nazareth, and Lititz in Pennsylvania and the Wachovia settlements in North Carolina, were enthusiastic celebrators of Christmas.  The Moravians in Bethlehem had the first Christmas trees in America as well as the first Nativity Scenes.  Christmas fell out of favour in the United States after the American Revolution, when it was considered an English custom.  George Washington attacked Hessian (German) mercenaries on the day after Christmas during the Battle of Trenton on December the 26th, 1776.  Christmas was much more popular in Germany than in America at this time.

With the atheistic Cult of Reason in power during the era of Revolutionary France, Christian Christmas religious services were banned and the Three Kings cake was renamed the equality cake under anticlerical government policies.

Image is by Josiah King via Wikipedia and is in the public domain

The Examination and Tryal of Old Father Christmas by Josiah King.

This was published after Christmas was reinstated as a holy day in England.  It shows the frontispiece to King’s pamphlet The Examination and Tryal of Old Father Christmas, published in 1687.  He had previously published a pamphlet with a very similar title, The Examination and Tryall of Old Father Christmas, in 1658, using the same image as the frontispiece.

19th Century

In the early 19th century, Christmas festivities and services became widespread with the rise of the Oxford Movement in the Church of England that emphasised the centrality of Christmas in Christianity and charity to the poor, along with Charles Dickens, Washington Irving, and other authors emphasising family, children, kind-heartedness, gift-giving, and Father Christmas (for Dickens) or Santa Claus (for Irving).

In the early-19th century, writers imagined Tudor-period Christmas as a time of heartfelt celebration. In 1843, Charles Dickens wrote the novel A Christmas Carol, which helped revive the spirit of Christmas and seasonal merriment.  Its instant popularity played a major role in portraying Christmas as a holiday emphasising family, goodwill, and compassion.

Dickens sought to construct Christmas as a family-centred festival of generosity, linking worship and feasting, within a context of social reconciliation.  Superimposing his humanitarian vision of the holiday, in what has been termed Carol Philosophy, Dickens influenced many aspects of Christmas that are celebrated today in Western culture, such as family gatherings, seasonal food and drink, dancing, games, and a festive generosity of spirit.  A prominent phrase from the tale, Merry Christmas, was popularised following the appearance of the story.  This coincided with the appearance of the Oxford Movement and the growth of Anglo-Catholicism, which led to a revival in traditional rituals and religious observances.

In 1822, Clement Clarke Moore wrote the poem A Visit From St. Nicholas (popularly known by its first line Twas the Night Before Christmas).  The poem helped popularise the tradition of exchanging gifts, and seasonal Christmas shopping began to assume economic importance.  This also started the cultural conflict between the holiday’s spiritual significance and its associated commercialism which some see as corrupting the holiday.  In her 1850 book The First Christmas in New England, Harriet Beecher Stowe includes a character who complains that the true meaning of Christmas was lost in a shopping spree.

While the celebration of Christmas was not yet customary in some regions in the U.S., Henry Wadsworth Longfellow detected a transition state about Christmas in New England in 1856.  He stated that the old Puritan feeling prevented it from being a cheerful, hearty holiday, though every year made it more so.  In Reading, Pennsylvania, a newspaper remarked in 1861, that “even our Presbyterian friends who have hitherto steadfastly ignored Christmas threw open their church doors and assembled in force to celebrate the anniversary of the Savior’s birth.”

The First Congregational Church of Rockford, Illinois (although of genuine Puritan stock), was preparing for a grand Christmas jubilee, a news correspondent reported in 1864.  By 1860, fourteen states including several from New England had adopted Christmas as a legal holiday.  In 1875, Louis Prang introduced the Christmas card to Americans.  He has been called the father of the American Christmas card.  On June the 28th, 1870, Christmas was formally declared a United States federal holiday.

Image by John Leech via Wikipedia and is in the public domain

Scrooge’s Third Visitor by John Leech.

This image is from Charles Dickens’ A Christmas Carol published in 1843.  It is from one of four hand-coloured etchings included in the first edition.  There were also four black and white engravings.

Image by Joseph Lionel Williams via Wikipedia and is in the public domain

The Queen’s Christmas Tree at Windsor Castle by Joseph Lionel Williams.

This wood engraving print was made for The Illustrated London News, Christmas Number 1848.

Image by Adolph Tidemand via Wikipedia and is in the public domain

A Norwegian Christmas by Adolph Tidemand.

This painting is from 1846.

20th Century

During the First World War and particularly (but not exclusively) in 1914, a series of informal truces took place for Christmas between opposing armies.  The truces, which were organised spontaneously by fighting men, ranged from promises not to shoot (shouted at a distance to ease the pressure of war for the day) to friendly socialising, gift-giving and even sport between enemies.  These incidents became a well-known and semi-mythologised part of popular memory.  They have been described as a symbol of common humanity even in the darkest of situations and used to demonstrate to children the ideals of Christmas.

Up to the 1950’s in the United Kingdom, many Christmas customs were restricted to the upper and middle classes.   Most of the population had not yet adopted many Christmas rituals that later became popular, including Christmas trees.  Christmas dinner would normally include beef or goose, not turkey as would later be common.  Children would get fruit and sweets in their stockings rather than elaborate gifts.  The full celebration of a family Christmas with all the trimmings only became widespread with increased prosperity from the 1950’s.  National papers were published on Christmas Day until 1912.  Post was still delivered on Christmas Day until 1961.  League football matches continued in Scotland until the 1970’s while in England they ceased at the end of the 1950’s.

Image by unknown via Wikipedia and is in the public domain

The Christmas Visit by unknown.

This postcard is from circa 1910. 

Nativity

The gospels of Luke and Matthew describe Jesus as being born in Bethlehem to the Virgin Mary.  In the Gospel of Luke, Joseph and Mary travel from Nazareth to Bethlehem to be counted for a census, and Jesus is born there and placed in a manger.  Angels proclaim him a saviour for all people, and shepherds come to adore him.  In the Gospel of Matthew, by contrast, magi follow a star to Bethlehem to bring gifts to Jesus, born the king of the Jews.  King Herod orders the massacre of all the boys less than two years old in Bethlehem, but the family flees to Egypt and later returns to Nazareth.

Read more about The Nativity here.

Image is by Gerard van Honthorst via Wikipedia and is in the public domain

Adoration of the Shepherds by Gerard van Honthorst.

This painting of Mary, Jesus and the shepherds was created in 1622.

Relation To Concurrent Celebrations

Many popular customs associated with Christmas developed independently of the commemoration of Jesus’ birth, with some claiming that certain elements are Christianised and have origins in pre-Christian festivals that were celebrated by pagan populations who were later converted to Christianity.  Other scholars reject these claims and affirm that Christmas customs largely developed in a Christian context.  The prevailing atmosphere of Christmas has also continually evolved since the holiday’s inception, ranging from a sometimes raucous, drunken, carnival-like state in the Middle Ages, to a tamer family-oriented and children-centered theme introduced in a 19th-century transformation.  The celebration of Christmas was banned on more than one occasion within certain groups, such as the Puritans and Jehovah’s Witnesses (who do not celebrate birthdays in general), due to concerns that it was too unbiblical.

Prior to and through the early Christian centuries, winter festivals were the most popular of the year in many European pagan cultures.  Reasons included the fact that less agricultural work needed to be done during the winter, as well as an expectation of better weather as spring approached.  Celtic winter herbs such as mistletoe and ivy, and the custom of kissing under a mistletoe, are common in modern Christmas celebrations in the English-speaking countries.

The pre-Christian Germanic peoples (including the Anglo-Saxons and the Norse) celebrated a winter festival called Yule, held in the late December to early January period, yielding modern English yule, today used as a synonym for Christmas.  In Germanic language-speaking areas, numerous elements of modern Christmas folk custom and iconography may have originated from Yule, including the Yule log, Yule boar, and the Yule goat.  Often leading a ghostly procession through the sky (the Wild Hunt), the long-bearded god Odin is referred to as the Yule one and Yule father in Old Norse texts, while other gods are referred to as Yule beings.  On the other hand, as there are no reliable existing references to a Christmas log prior to the 16th century, the burning of the Christmas block may have been an early modern invention by Christians unrelated to the pagan practice.

In eastern Europe, too, pre-Christian traditions were incorporated into Christmas celebrations, an example being the Koleda, which shares parallels with the Christmas carol.

Image is by Herrad of Landsberg via Wikipedia and is in the public domain

The Nativity of Christ by Herrad of Landsberg.

This 12th-century, medieval illustration is from the Hortus deliciarum.

Observance And Traditions

Christmas Day is celebrated as a major festival and public holiday in countries around the world, including many whose populations are mostly non-Christian.  In some non-Christian areas, periods of former colonial rule introduced the celebration (e.g. Hong Kong); in others, Christian minorities or foreign cultural influences have led populations to observe the holiday.  Countries such as Japan, where Christmas is popular despite there being only a small number of Christians, have adopted many of the cultural aspects of Christmas, such as gift-giving, decorations, and Christmas trees.  A similar example is Turkey, a Muslim-majority country with a small number of Christians, where Christmas trees and decorations tend to line public streets during the festival.

Among countries with a strong Christian tradition, a variety of Christmas celebrations have developed that incorporate regional and local cultures.

Read more about Observance And Traditions here and here.

Image © Israel Press and Photo Agency via Wikipedia

Christmas at the Annunciation Church in Nazareth.

This photo by Dan Hadani, from his collection at the National Library of Israel, was taken on Christmas Eve, 1965.

Decorations

Nativity scenes are known from 10th-century Rome. They were popularised by Saint Francis of Assisi from 1223, quickly spreading across Europe.  Different types of decorations developed across the Christian world, dependent on local tradition and available resources, and can vary from simple representations of the crib to far more elaborate sets.  Renowned manger scene traditions include the colourful Krakow szopka in Poland, which imitate Krakow’s historical buildings as settings, the elaborate Italian presepi (Neapolitan, Genoese and Bolognese), or the Provencal creches in southern France, using hand-painted terracotta figurines called santons.  In certain parts of the world, notably Sicily, living nativity scenes following the tradition of Saint Francis are a popular alternative to static creches.  The first commercially produced decorations appeared in Germany in the 1860s, inspired by paper chains made by children.  In countries where a representation of the Nativity scene is very popular, people are encouraged to compete and create the most original or realistic ones.  Within some families, the pieces used to make the representation are considered a valuable family heirloom.

The traditional colours of Christmas decorations are red, green, and gold.  Red symbolises the blood of Jesus, which was shed in his crucifixion; green symbolises eternal life, and in particular the evergreen tree, which does not lose its leaves in the winter; and gold is the first colour associated with Christmas, as one of the three gifts of the Magi, symbolising royalty.

The Christmas tree was first used by German Lutherans in the 16th century, with records indicating that a Christmas tree was placed in the Cathedral of Strassburg in 1539, under the leadership of the Protestant Reformer, Martin Bucer.  German Lutheran immigrants later brought the decorated Christmas tree with them to the United States.  The Moravians put lighted candles on the trees.  When decorating the Christmas tree, many individuals place a star at the top of the tree symbolising the Star of Bethlehem, a fact recorded by The School Journal in 1897.  Professor David Albert Jones of Oxford University wrote that in the 19th century, it became popular for people to also use an angel to top the Christmas tree in order to symbolise the angels mentioned in the accounts of the Nativity of Jesus.  Additionally, in the context of a Christian celebration of Christmas, the Christmas tree, being evergreen in colour, is symbolic of Christ, who offers eternal life, and the candles or lights on the tree represent the Light of the World.  Christian services for family use and public worship have been published for the blessing of a Christmas tree, after it has been erected.  The Christmas tree is considered by some as a Christianisation of pagan tradition and ritual surrounding the Winter Solstice, which included the use of evergreen boughs, and an adaptation of pagan tree worship.  According to eighth-century biographer Æddi Stephanus, Saint Boniface (634 – 709), who was a missionary in Germany, took an axe to an oak tree dedicated to Thor and pointed out a fir tree, which he stated was a more fitting object of reverence because it pointed to heaven and it had a triangular shape, which he said was symbolic of the Trinity.  The English-language phrase Christmas tree is first recorded in 1835 and represents an importation from the German language.

Since the 16th century, the poinsettia, a plant native to Mexico, has been associated with Christmas, carrying the Christian symbolism of the Star of Bethlehem; in that country it is known in Spanish as the Flower of the Holy Night.  Other popular holiday plants include holly, mistletoe, red amaryllis, and Christmas cactus.

Other traditional decorations include bells, candles, candy canes, stockings, wreaths, and angels.  The display of wreaths and of candles in each window is a more traditional Christmas custom.  Christmas wreaths are made up of a concentric assortment of leaves, usually from an evergreen, and are designed to prepare Christians for the Advent season.  Candles in each window are meant to demonstrate the fact that Christians believe that Jesus Christ is the ultimate light of the world.

Christmas lights and banners may be hung along streets, music played from speakers, and Christmas trees placed in prominent places.  It is common in many parts of the world for town squares and consumer shopping areas to sponsor and display decorations.  Rolls of brightly coloured paper with secular or religious Christmas motifs are manufactured to wrap gifts.  In some countries, Christmas decorations are traditionally taken down on the Twelfth Night.

Read more about Decorations here and here.

Image by unknown is from the Diocesan Museum of Sacred Art via Wikipedia

A typical Neapolitan Nativity scene by unknown.

This eighteenth-century nativity scene painting is also known as a presepe or presepio and can be found at the Diocesan Museum of Sacred Art in Bilbao, Spain.

Local creches are renowned for their ornate decorations and symbolic figurines, often mirroring daily life.

Image © of TaniaLuz via iStock

A Christmas tree and presents.

Image by Robert Knudsen is from the Kennedy Library via Wikipedia and is in the public domain

The official White House Christmas tree for 1962 by Robert Knudsen.

The official White House Christmas tree above is in the entrance hall.  The tree is usually located in the Blue Room; this was one of only a few instances since 1961 when it has been displayed elsewhere.

It was presented by President John F. Kennedy and First Lady Jacqueline Kennedy at the Christmas Reception on the 12th of December, 1962 at the White House, U.S.A. 

Image © of PFAStudent via Wikipedia

The Christ Candle in the centre of an Advent wreath.

This is traditionally lit in many church services.  This one is in the chancel of Broadway United Methodist Church, located in New Philadelphia, U.S.A.

The Advent wreath consists of four coloured candles of the same size, arranged around a larger white Christ candle.

Nativity Play

For the Christian celebration of Christmas, the viewing of the Nativity play is one of the oldest Christmastime traditions, with the first reenactment of the Nativity of Jesus taking place in 1223 A.D.  In that year, Francis of Assisi assembled a Nativity scene outside of his church in Italy and children sang Christmas carols celebrating the birth of Jesus.  Each year, this grew larger and people travelled from afar to see Francis’ depiction of the Nativity of Jesus that came to feature drama and music.  Nativity plays eventually spread throughout all of Europe, where they remain popular.  Christmas Eve and Christmas Day church services often came to feature Nativity plays, as did schools and theatres.  In France, Germany, Mexico and Spain, Nativity plays are often reenacted outdoors in the streets.

Read more about Nativity Play here.

Image © of Wesley Fryer via Wikipedia

Children in Oklahoma reenact a Nativity play.

These children are performing their nativity play in 2007 at the First Presbyterian Church in Edmond, Oklahoma, U.S.A.

Music And Carols

The earliest extant specifically Christmas hymns appear in fourth-century Rome.  Latin hymns such as Veni redemptor gentium, written by Ambrose, Archbishop of Milan, were austere statements of the theological doctrine of the Incarnation in opposition to Arianism.  Corde natus ex Parentis (Of the Father’s love begotten) by the Spanish poet Prudentius (died 413) is still sung in some churches today.  In the 9th and 10th centuries, the Christmas Sequence or Prose was introduced in North European monasteries, developing under Bernard of Clairvaux into a sequence of rhymed stanzas. In the 12th century the Parisian monk Adam of St. Victor began to derive music from popular songs, introducing something closer to the traditional Christmas carol.  Christmas carols in English appear in a 1426 work of John Awdlay who lists twenty-five “caroles of Cristemas”, probably sung by groups of wassailers, who went from house to house.

Read more about Music And Carols here.

Image © of Man vyi via Wikipedia and is in the public domain

Christmas carolers in Jersey.

Image by unknown is via Wikipedia and is in the public domain

Child singers in Bucharest by unknown.

This picture is from 1842 and depicts the singers carrying a star with an icon of a saint on it.

Christmas Food

A special Christmas family meal is traditionally an important part of the holiday celebration, and the food that is served varies greatly from country to country.  Some regions have special meals for Christmas Eve, such as Sicily, where 12 kinds of fish are served.  In the United Kingdom and countries influenced by its traditions, a standard Christmas meal usually includes turkey, goose or another large bird, gravy, potatoes, vegetables and sometimes bread, with cider or some other alcoholic drink for the adults.  Special desserts are also prepared, such as Christmas pudding, mince pies, Christmas cake, Panettone and a Yule log cake.  A traditional Christmas meal in Central Europe features fried carp or other fish.

Read more about Christmas Food here.

Image © of Austin McGee via Wikipedia

A Christmas dinner setting.

Christmas Cards

Christmas cards are illustrated messages of greeting exchanged between friends and family members during the weeks preceding Christmas Day.  The traditional greeting reads wishing you a Merry Christmas and a Happy New Year, much like that of the first commercial Christmas card, produced by Sir Henry Cole in London in 1843.  The custom of sending them has become popular among a wide cross-section of people with the emergence of the modern trend towards exchanging E-cards.

Christmas cards are purchased in considerable quantities and feature artwork that is commercially designed and relevant to the season.  The content of the design might relate directly to the Christmas narrative, with depictions of the Nativity of Jesus, or Christian symbols such as the Star of Bethlehem, or a white dove, which can represent both the Holy Spirit and Peace on Earth.  Other Christmas cards are more secular and can depict Christmas traditions, mythical figures such as Father Christmas, objects directly associated with Christmas such as candles, holly, and baubles, or a variety of images associated with the season, such as Christmastide activities, snow scenes, and the wildlife of the northern winter.

Some prefer cards with a poem, prayer, or Biblical verse, while others distance themselves from religion with an all-inclusive Season’s greetings.

Read more about Christmas Cards here.

Image by unknown is from the Souvenir Post Card Company via Wikipedia and is in the public domain

A Christmas postcard with Father Christmas and some of his reindeer by unknown.

This card was published by the Souvenir Post Card Company in New York, U.S.A. in 1907. 

Christmas Stamps

A number of nations have issued commemorative stamps at Christmastide.  Postal customers will often use these stamps to mail Christmas cards, and they are popular with philatelists.  These stamps are regular postage stamps, unlike Christmas seals, and are valid for postage year-round.  They usually go on sale sometime between early October and early December and are printed in considerable quantities.

Read more about Christmas Stamps here.

Christmas Gifts

The exchanging of gifts is one of the core aspects of the modern Christmas celebration, making it the most profitable time of year for retailers and businesses throughout the world.  At Christmas, people exchange gifts based on the Christian tradition associated with Saint Nicholas, and the gifts of gold, frankincense, and myrrh which were given to the baby Jesus by the Magi.  The practice of gift giving in the Roman celebration of Saturnalia may have influenced Christian customs, but the Christian core dogma of the Incarnation solidly established the giving and receiving of gifts as the structural principle of that recurrent yet unique event, because it was the Biblical Magi, together with all their fellow men, who received the gift of God through man’s renewed participation in the divine life.  Thomas J. Talley holds that the Roman Emperor Aurelian placed the alternate festival on December the 25th in order to compete with the growing rate of the Christian Church, which had already been celebrating Christmas on that date.

Read more about Christmas Gifts here.

Image © of Kelvin Kay via Wikipedia

Christmas gifts under a Christmas tree.

Gift-Bearing Figures

Several figures are associated with Christmas and the seasonal giving of gifts.  The best known of these figures today is the red-dressed Father Christmas (the name more common in the United Kingdom, although the American term Santa Claus is becoming more popular).  Among many names around the world, he is known as Pere Noel, Joulupukki, Babbo Natale, Ded Moroz and tomte.  The Scandinavian tomte (also called nisse) is sometimes depicted as a gnome instead of Santa Claus.

The name Santa Claus can be traced back to the Dutch Sinterklaas (Saint Nicholas). Nicholas was a 4th-century Greek bishop of Myra, a city in the Roman province of Lycia, whose ruins are 3 kilometres (1.9 mi) from modern Demre in southwest Turkey.  Among other saintly attributes, he was noted for the care of children, generosity, and the giving of gifts.  His feast day, December the 6th, came to be celebrated in many countries with the giving of gifts.

Saint Nicholas traditionally appeared in bishop’s attire, accompanied by helpers, inquiring about the behaviour of children during the past year before deciding whether they deserved a gift or not.  By the 13th century, Saint Nicholas was well known in the Netherlands, and the practice of gift-giving in his name spread to other parts of central and southern Europe.  At the Reformation in 16th- and 17th-century Europe, many Protestants changed the gift bringer to the Christ Child or Christkindl, corrupted in English to Kris Kringle, and the date of giving gifts changed from December the 6th to Christmas Eve.

The modern popular image of Father Christmas, however, was created in the United States, and in particular in New York.  The transformation was accomplished with the aid of notable contributors including Washington Irving and the German-American cartoonist Thomas Nast (1840 – 1902).  Following the American Revolutionary War, some of the inhabitants of New York City sought out symbols of the city’s non-English past.  New York had originally been established as the Dutch colonial town of New Amsterdam and the Dutch Sinterklaas tradition was reinvented as Saint Nicholas.

Current tradition in several Latin American countries (such as Venezuela and Colombia) holds that while Father Christmas makes the toys, he then gives them to Baby Jesus, who is the one who delivers them to the children’s homes, a reconciliation between traditional religious beliefs and the iconography of Santa Claus imported from the United States.

In South Tyrol (Italy), Austria, Czech Republic, Southern Germany, Hungary, Liechtenstein, Slovakia, and Switzerland, the Christkind (Jezisek in Czech, Jezuska in Hungarian and Jezisko in Slovak) brings the presents.  Greek children get their presents from Saint Basil on New Year’s Eve, the eve of that saint’s liturgical feast.  The German St. Nikolaus is not identical to the Weihnachtsmann (who is the German version of Father Christmas).  St. Nikolaus wears a bishop’s dress and still brings small gifts (usually candies, nuts, and fruits) on December the 6th and is accompanied by Knecht Ruprecht.  Although many parents around the world routinely teach their children about Father Christmas and other gift bringers, some have come to reject this practice, considering it deceptive.

Multiple gift-giver figures exist in Poland, varying between regions and individual families. St Nicholas (Swiety Mikolaj) dominates Central and North-East areas, the Starman (Gwiazdor) is most common in Greater Poland, Baby Jesus (Dzieciątko) is unique to Upper Silesia, with the Little Star (Gwiazdka) and the Little Angel (Aniołek) being common in the South and the South-East.  Grandfather Frost (Dziadek Mroz) is less commonly accepted in some areas of Eastern Poland.  It is worth noting that across all of Poland, St Nicholas is the gift giver on Saint Nicholas Day on December the 6th.

You can read a well-known poem about St. Nicholas here.

Read more about Gift-Bearing Figures here.

Image © of CrazyPhunk via Wikipedia

Saint Nicholas.

See Also

Christmas in July – Second Christmas celebration.

Christmas Peace – Finnish tradition.

Christmas Sunday – Sunday after Christmas.

List of Christmas films.

List of Christmas novels – Christmas as depicted in literature.

Little Christmas – Alternative title for the 6th of January.

Nochebuena – Evening or entire day before Christmas Day.

Mithraism in comparison with other belief systems.

Christmas by medium – Christmas represented in different media.

You can see notes, references, further reading and external links to the above articles here.  The above was sourced from a page on Wikipedia and is subject to change. 

Links

Liliboas on iStock.  The image shown at the top of this page of a Christmas tree and presents is the copyright of Liliboas.  You can find more great work from the photographer Lili and lots more free stock photos at iStock.

The image above of a nativity scene made with Christmas lights is the copyright of Wikipedia user Crumpled Fire.  It comes with a Creative Commons licence (CC BY-SA 2.0).

The image above of the Nativity by unknown comes via Wikipedia and is in the public domain.

The image above of the Coronation of Charlemagne on Christmas of 800 by Julius Schnorr von Carolsfeld comes via Wikipedia and is in the public domain.

The image above of the Examination and Tryal of Old Father Christmas by Josiah King comes via Wikipedia and is in the public domain.

The image above of the Queen’s Christmas Tree at Windsor Castle by Joseph Lionel Williams comes via Wikipedia and is in the public domain.

The image above of a Norwegian Christmas by Adolph Tidemand comes via Wikipedia and is in the public domain.

The image above of the Christmas visit by unknown comes via Wikipedia and is in the public domain.

The image above of Adoration of the Shepherds by Gerard van Honthorst comes via Wikipedia and is in the public domain.

The image above of the  Nativity of Christ by Herrad of Landsberg comes via Wikipedia and is in the public domain.

The image above of Christmas at the Annunciation Church in Nazareth is the copyright of Wikipedia user Israel Press and Photo Agency.  It comes with a Creative Commons licence (CC BY-SA 4.0)

The image above of a typical Neapolitan Nativity scene by unknown comes from the Diocesan Museum of Sacred Art.  It comes with a Creative Commons licence (CC BY-SA 4.0)

The image above of the official White House Christmas tree for 1962 by Robert Knudsen comes from the Kennedy Library via Wikipedia and is in the public domain.

The image above of the Christ Candle in the centre of an Advent wreath is the copyright of Wikipedia user PFAStudent.  It comes with a Creative Commons licence (CC BY-SA 3.0).

The image above of children in Oklahoma reenacting a Nativity play is the copyright of Wikipedia user Wesley Fryer.  It comes with a Creative Commons licence (CC BY-SA 2.0).

The image above of Christmas carolers in Jersey is by Wikipedia user Man vyi and is in the public domain.

The image above of a Christmas dinner setting is the copyright of Wikipedia user Austin McGee.  It comes with a Creative Commons licence (CC BY-SA 2.0).

The image above of a Christmas postcard with Father Christmas and some of his reindeer by unknown comes via Wikipedia and is in the public domain.

The image above of  Christmas gifts under a Christmas tree is the copyright of Wikipedia user Kelvin Kay.  It comes with a Creative Commons licence (CC BY-SA 2.0).

The image above of Saint Nicholas is the copyright of Wikipedia user CrazyPhunk.  It comes with a Creative Commons licence (CC BY-SA 3.0).

Halloween

Image © of Toby Ord via Wikipedia

Growing up in England as a child and then a teenager in the 1960s, 1970s and 1980s, Halloween was an American thing you saw on the telly.  There was no dressing up and trick-or-treating, not in my family home anyway.  Even when my kids were younger I never really bothered much about Halloween.  It was all just too American for me, and I preferred the English traditions I was brought up with.  My kids had fun wearing masks, bobbing for apples and so on, but we never went out dressed up knocking on people’s doors.  In fact, I don’t recall ever seeing anyone else do it either.

Nowadays all of the above is a common sight.  I am no killjoy and I don’t knock anyone who really enjoys it.  I admit it’s a fun thing for kids to do and a good excuse for a party for the adults which I have enjoyed going to in the past few years.  When you have suffered from depression and anxiety for as long as I have, just to be included can be a lifesaver.

The main things I like about Halloween are the dressing up and the horror theme.  I never celebrated Halloween in the past because I have loved horror since I was a kid.  Every day is Halloween for me, ha ha.

About Halloween 

Halloween or Hallowe’en (less commonly known as Allhalloween, All Hallows’ Eve, or All Saints’ Eve) is a celebration observed in many countries on the 31st of October, the eve of the Western Christian feast of All Saints’ Day.  It begins the observance of Allhallowtide, the time in the liturgical year dedicated to remembering the dead, including saints (hallows), martyrs, and all the faithful departed.

One theory holds that many Halloween traditions were influenced by Celtic harvest festivals, particularly the Gaelic festival Samhain, which is believed to have pagan roots.  Some go further and suggest that Samhain may have been Christianised as All Hallows’ Day, along with its eve, by the early Church.  Other academics believe Halloween began solely as a Christian holiday, being the vigil of All Hallows’ Day.  Celebrated in Ireland and Scotland for centuries, many Halloween customs were taken to North America by Irish and Scottish immigrants in the 19th century, and then, through American influence, Halloween spread to other countries by the late 20th and early 21st century.

Popular Halloween activities include trick-or-treating (or the related guising and souling), attending Halloween costume parties, carving pumpkins or turnips into jack-o’-lanterns, lighting bonfires, apple bobbing, divination games, playing pranks, visiting haunted attractions, telling scary stories, and watching horror or Halloween-themed films.  Some people practice the Christian religious observances of All Hallows’ Eve, including attending church services and lighting candles on the graves of the dead, although it is a secular celebration for others.  Some Christians historically abstained from meat on All Hallows’ Eve, a tradition reflected in the eating of certain vegetarian foods on this vigil day, including apples, potato pancakes, and soul cakes.  

Image © of Toby Ord via Wikipedia

A Jack o’ Lantern made for the Holywell Manor Halloween celebrations in 2003. 

Etymology  

The word Halloween or Hallowe’en (Saints’ evening) is of Christian origin.  It is a term equivalent to All Hallows Eve and is attested in Old English.  It comes from the Scottish form of All Hallows’ Eve (the evening before All Hallows’ Day).  Even is the Scots term for eve or evening, and is contracted to e’en or een, so (All) Hallow(s) E(v)en became Hallowe’en.

The History Of Halloween   

Christian Origins And Historic Customs 

Halloween is thought to have influences from Christian beliefs and practices.  The English word Halloween comes from All Hallows’ Eve, being the evening before the Christian holy days of All Hallows’ Day (All Saints’ Day) on the 1st of November and All Souls’ Day on the 2nd of November.  Since the time of the early Church, major feasts in Christianity (such as Christmas, Easter and Pentecost) had vigils that began the night before, as did the feast of All Hallows’.  These three days are collectively called Allhallowtide and are a time when Western Christians honour all saints and pray for recently departed souls who have yet to reach Heaven.  Commemorations of all saints and martyrs were held by several churches on various dates, mostly in springtime.  In 4th-century Roman Edessa it was held on the 13th of May, and on this date in 609, Pope Boniface IV re-dedicated the Pantheon in Rome to St Mary and all martyrs.  This was the date of Lemuria, an ancient Roman festival of the dead.

In the 8th century, Pope Gregory III (731 – 741) founded an oratory in St. Peter’s for the relics of the holy apostles and of all saints, martyrs and confessors.  Some sources say it was dedicated on the 1st of November, while others say it was on Palm Sunday in April 732.  By 800, there was evidence that churches in Ireland and Northumbria were holding a feast commemorating all saints on November 1st.  Alcuin of Northumbria, a member of Charlemagne’s court, may then have introduced this 1st of November date in the Frankish Empire.  In 835, it became the official date in the Frankish Empire.  Some suggest this was due to Celtic influence, while others suggest it was a Germanic idea, although it is claimed that both Germanic and Celtic-speaking peoples commemorated the dead at the beginning of winter.  They may have seen it as the most fitting time to do so, as it is a time of dying in nature.  It is also suggested the change was made on the practical grounds that Rome in summer could not accommodate the great number of pilgrims who flocked to it, and perhaps because of public health concerns over Roman Fever, which claimed a number of lives during Rome’s sultry summers.

By the end of the 12th century, the celebration had become known as the holy days of obligation in Western Christianity and involved such traditions as ringing church bells for souls in purgatory.  It was also customary for criers dressed in black to parade the streets, ringing a bell of mournful sound and calling on all good Christians to remember the poor souls.  The Allhallowtide custom of baking and sharing soul cakes for all christened souls has been suggested as the origin of trick-or-treating.  The custom dates back at least as far as the 15th century and was found in parts of England, Wales, Flanders, Bavaria and Austria.  Groups of poor people, often children, would go door-to-door during Allhallowtide, collecting soul cakes, in exchange for praying for the dead, especially the souls of the givers’ friends and relatives.  This was called souling.  Soul cakes were also offered for the souls themselves to eat,  or the soulers would act as their representatives.  As with the Lenten tradition of hot cross buns, soul cakes were often marked with a cross, indicating they were baked as alms.  Shakespeare mentions souling in his comedy The Two Gentlemen of Verona (1593).  While souling, Christians would carry lanterns made of hollowed-out turnips, which could have originally represented souls of the dead.  These jack-o’-lanterns were used to ward off evil spirits.  On All Saints’ and All Souls’ Day during the 19th century, candles were lit in homes in Ireland, Flanders, Bavaria, and Tyrol, where they were called soul lights, which served to guide the souls back to visit their earthly homes.  In many of these places, candles were also lit at graves on All Souls’ Day.  In Brittany, libations of milk were poured on the graves of kinfolk, or food would be left overnight on the dinner table for the returning souls.  This custom was also found in Tyrol and parts of Italy.

Christian minister Prince Sorie Conteh linked the wearing of costumes to the belief in vengeful ghosts.  It was traditionally believed that the souls of the departed wandered the earth until All Saints’ Day, and All Hallows’ Eve provided one last chance for the dead to gain vengeance on their enemies before moving to the next world.  In order to avoid being recognised by any soul that might be seeking such vengeance, people would don masks or costumes.  In the Middle Ages, churches in Europe that were too poor to display relics of martyred saints at Allhallowtide let parishioners dress up as saints instead.  Some Christians observe this custom at Halloween today.   American historian Lesley Bannatyne believes this could have been a Christianisation of an earlier pagan custom.   Many Christians in mainland Europe, especially in France, believed that once a year, on Hallowe’en, the dead of the churchyards rose for one wild, hideous carnival known as the danse macabre, which was often depicted in church decoration. Historians Christopher Allmand and Rosamond McKitterick write in The New Cambridge Medieval History that the danse macabre urged Christians not to forget the end of all earthly things.  The danse macabre was sometimes enacted in European village pageants and court masques, with people dressing up as corpses from various strata of society, and this may be the origin of Halloween costume parties.

In Britain, these customs came under attack during the Reformation, as Protestants berated purgatory as a popish doctrine incompatible with the Calvinist doctrine of predestination.  State-sanctioned ceremonies associated with the intercession of saints and prayer for souls in purgatory were abolished during the Elizabethan reform, though All Hallows’ Day remained in the English liturgical calendar to commemorate saints as godly human beings.  For some Nonconformist Protestants, the theology of All Hallows’ Eve was redefined: souls could not be journeying from Purgatory on their way to Heaven, as Catholics frequently believe and assert; instead, the so-called ghosts were thought to be in actuality evil spirits.  Other Protestants believed in an intermediate state known as Hades (Bosom of Abraham).  In some localities, Catholics and Protestants continued souling, candlelit processions, or ringing church bells for the dead.  The Anglican church eventually suppressed this bell-ringing.  Mark Donnelly, a professor of medieval archaeology, and historian Daniel Diehl both wrote that barns and homes were blessed to protect people and livestock from the effect of witches, who were believed to accompany the malignant spirits as they travelled the earth.  After 1605, Hallowtide was eclipsed in England by Guy Fawkes Night (November the 5th), which appropriated some of its customs.  In England, the ending of official ceremonies related to the intercession of saints led to the development of new, unofficial Hallowtide customs.  In 18th- and 19th-century rural Lancashire, Catholic families gathered on hills on the night of All Hallows’ Eve.  One held a bunch of burning straw on a pitchfork while the rest knelt around him, praying for the souls of relatives and friends until the flames went out.  This was known as teen’lay.  There was a similar custom in Hertfordshire and the lighting of tindle fires in Derbyshire.
Some suggested these tindles were originally lit to guide the poor souls back to earth.  In Scotland and Ireland, old Allhallowtide customs that were at odds with Reformed teaching were not suppressed as they were important to the life cycle and rites of passage of local communities and curbing them would have been difficult.

In parts of Italy until the 15th century, families left a meal out for the ghosts of relatives before leaving for church services.  In 19th-century Italy, churches staged theatrical re-enactments of scenes from the lives of the saints on All Hallows’ Day, with participants represented by realistic wax figures.  In 1823, the graveyard of Holy Spirit Hospital in Rome presented a scene in which the bodies of those who had recently died were arrayed around a wax statue of an angel who pointed upward towards heaven.  In the same country, parish priests went house-to-house, asking for small gifts of food which they shared among themselves throughout that night.  In Spain, they continue to bake special pastries called bones of the holy (Spanish: Huesos de Santo) and set them on graves.  At cemeteries in Spain and France, as well as in Latin America, priests lead Christian processions and services during Allhallowtide, after which people keep an all-night vigil.  In 19th-century San Sebastian, there was a procession to the city cemetery at Allhallowtide, an event that drew beggars who appealed to the tender recollections of one’s deceased relations and friends for sympathy.

Image via Wikipedia by John Masey Wright is in the public domain

Halloween (1785) by Scottish poet Robert Burns recounts various legends of the holiday.

Image © unknown via Wikipedia is in the public domain

A Bangladeshi girl lighting grave candles on the headstone of a deceased relative in the city of Chittagong for the observance of Allhallowtide.

While she is doing this, her mother is praying for their departed relative.  In the background, other Bangladeshi Christians are hanging garlands on cross-shaped gravestones.

Image © unknown via Wikipedia is in the public domain

Four young adult Lutheran Christians praying on the night of All Hallows’ Eve (Halloween) for Christian martyrs, saints, and all the faithful departed, especially their loved ones, in preparation for All Hallows’ Day (All Saints’ Day), the following day of Hallowtide.

These Swedes, as well as other believers, have also lit votive candles and hung wreaths near the crucifix by which they are solemnly praying.  This photograph was taken in the Solna Municipality of Stockholm, Sweden. 

The Geography Of Halloween  

You can read more Geography of Halloween here.  

Image © 663highland via Wikipedia

A Halloween display in Harborland, Kobe, Hyogo, Japan. 

Gaelic Folk Influence 

Today’s Halloween customs are thought to have been influenced by folk customs and beliefs from the Celtic-speaking countries, some of which are believed to have pagan roots.  Jack Santino, a folklorist, writes that “there was throughout Ireland an uneasy truce existing between customs and beliefs associated with Christianity and those associated with religions that were Irish before Christianity arrived”.  The origins of Halloween customs are typically linked to the Gaelic festival Samhain.

Samhain is one of the quarter days in the medieval Gaelic calendar and has been celebrated from October 31st to November 1st in Ireland, Scotland and the Isle of Man.  A kindred festival has been held by the Brittonic Celts, called Calan Gaeaf in Wales, Kalan Gwav in Cornwall and Kalan Goañv in Brittany, a name meaning the first day of winter.  For the Celts, the day ended and began at sunset, thus the festival begins the evening before the 1st of November by modern reckoning.  Samhain is mentioned in some of the earliest Irish literature.  The names have been used by historians to refer to Celtic Halloween customs up until the 19th century, and are still the Gaelic and Welsh names for Halloween.

Samhain marked the end of the harvest season and the beginning of winter or the darker half of the year.  It was seen as a liminal time when the boundary between this world and the Otherworld thinned.  This meant the Aos Sí, the spirits or fairies, could more easily come into this world and were particularly active.  Most scholars see them as degraded versions of ancient gods whose power remained active in the people’s minds even after they had been officially replaced by later religious beliefs.  They were both respected and feared, with individuals often invoking the protection of God when approaching their dwellings. At Samhain, the Aos Sí were appeased to ensure the people and livestock survived the winter.  Offerings of food and drink, or portions of the crops, were left outside for them.  The souls of the dead were also said to revisit their homes seeking hospitality.  Places were set at the dinner table and by the fire to welcome them.  The belief that the souls of the dead return home on one night of the year and must be appeased seems to have ancient origins and is found in many cultures.  In 19th century Ireland, candles would be lit and prayers formally offered for the souls of the dead.  After this, the eating, drinking, and games would begin.

Throughout Ireland and Britain, especially in the Celtic-speaking regions, the household festivities included divination rituals and games intended to foretell one’s future, especially regarding death and marriage.  Apples and nuts were often used, and customs included apple bobbing, nut roasting, scrying or mirror-gazing, pouring molten lead or egg whites into water, dream interpretation, and others.  Special bonfires were lit and there were rituals involving them.  Their flames, smoke, and ashes were deemed to have protective and cleansing powers.  In some places, torches lit from the bonfire were carried sunwise around homes and fields to protect them.  It is suggested the fires were a kind of imitative or sympathetic magic – they mimicked the Sun and held back the decay and darkness of winter.  They were also used for divination and to ward off evil spirits.  In Scotland, these bonfires and divination games were banned by the church elders in some parishes.  In Wales, bonfires were also lit to prevent the souls of the dead from falling to earth.  Later, the bonfires were said to keep away the devil.

From at least the 16th century, the festival included mumming and guising in Ireland, Scotland, the Isle of Man and Wales.  This involved people going house-to-house in costume (or in disguise), usually reciting verses or songs in exchange for food.  It may have originally been a tradition whereby people impersonated the Aos Sí, or the souls of the dead, and received offerings on their behalf, similar to souling.  Impersonating these beings, or wearing a disguise, was also believed to protect oneself from them.  In parts of southern Ireland, the guisers included a hobby horse.  A man dressed as a Láir Bhán (white mare) led youths house-to-house reciting verses (some of which had pagan overtones) in exchange for food.  If the household donated food it could expect good fortune from the Muck Olla and not doing so would bring misfortune.  In Scotland, youths went house-to-house with masked, painted or blackened faces, often threatening to do mischief if they were not welcomed.   F. Marian McNeill suggests the ancient festival included people in costume representing the spirits, and that faces were marked or blackened with ashes from the sacred bonfire.  In parts of Wales, men went about dressed as fearsome beings called gwrachod.  In the late 19th and early 20th century, young people in Glamorgan and Orkney cross-dressed.

Elsewhere in Europe, mumming was part of other festivals, but in the Celtic-speaking regions, it was particularly appropriate to a night upon which supernatural beings were said to be abroad and could be imitated or warded off by human wanderers.  From at least the 18th century, imitating malignant spirits led to playing pranks in Ireland and the Scottish Highlands.  Wearing costumes and playing pranks at Halloween did not spread to England until the 20th century.  Pranksters used hollowed-out turnips or mangel wurzels as lanterns, often carved with grotesque faces.  Those who made them variously said the lanterns represented the spirits or were used to ward off evil spirits.  They were common in parts of Ireland and the Scottish Highlands in the 19th century, as well as in Somerset, where the night is known as Punkie Night.  In the 20th century, they spread to other parts of Britain and became generally known as jack-o’-lanterns.

Image © Rannpháirtí anaithnid via Wikipedia

A traditional Irish Halloween mask.

This early 20th-century mask is displayed at the Museum of Country Life in Ireland.  

Image by Daniel Maclise via Wikipedia is in the public domain

Snap-Apple Night, painted by Irish artist Daniel Maclise in 1833.

It shows people feasting and playing divination games on Halloween in Ireland.   It was inspired by a Halloween party he attended in Blarney, in 1832.   

Image © Rannphairti anaithnid via Wikipedia

A traditional Irish Jack-o’-lantern.

This plaster cast of a Halloween turnip (rutabaga) lantern is on display in the Museum of Country Life in Ireland.

Spread To North America 

Lesley Bannatyne and Cindy Ott write that Anglican colonists in the southern United States and Catholic colonists in Maryland recognised All Hallows’ Eve in their church calendars, although the Puritans of New England strongly opposed the holiday, along with other traditional celebrations of the established Church, including Christmas.  Almanacs of the late 18th and early 19th century give no indication that Halloween was widely celebrated in North America.

It was not until after mass Irish and Scottish immigration in the 19th century that Halloween became a major holiday in America.  Most American Halloween traditions were inherited from the Irish and Scots, though in Cajun areas, a nocturnal Mass was said in cemeteries on Halloween night.  Candles that had been blessed were placed on graves, and families sometimes spent the entire night at the graveside.  Originally confined to these immigrant communities, it was gradually assimilated into mainstream society and was celebrated coast to coast by people of all social, racial, and religious backgrounds by the early 20th century.   Then, through American influence, these Halloween traditions spread to many other countries by the late 20th and early 21st century, including to mainland Europe and some parts of the Far East. 

Image © InSapphoWeTrust via Wikipedia

The Greenwich Village Halloween Parade.

This annual Halloween Parade takes place in New York, U.S.A. and it heads up Sixth Avenue.  It’s hard to top this when it comes to Halloween, whether in New York City or anywhere else.  This group is doing the mass zombie dance as seen in Michael Jackson’s Thriller music video.    

Symbols  

Artefacts and symbols associated with Halloween developed over time.  Jack-o’-lanterns are traditionally carried by guisers on All Hallows’ Eve in order to frighten evil spirits.  There is a popular Irish Christian folktale associated with the jack-o’-lantern, which in folklore is said to represent a soul who has been denied entry into both heaven and hell.

The folktale says that en route home after a night’s drinking, Jack encounters the Devil and tricks him into climbing a tree.  A quick-thinking Jack etches the sign of the cross into the bark, thus trapping the Devil.  Jack strikes a bargain that Satan can never claim his soul.  After a life of sin, drink, and mendacity, Jack is refused entry to heaven when he dies.  Keeping his promise, the Devil refuses to let Jack into hell and throws a live coal straight from the fires of hell at him.  It was a cold night, so Jack placed the coal in a hollowed-out turnip to stop it from going out, since which time Jack and his lantern have been roaming, looking for a place to rest.

In Ireland and Scotland, the turnip has traditionally been carved during Halloween, but immigrants to North America used the native pumpkin, which is both much softer and much larger, making it easier to carve than a turnip. The American tradition of carving pumpkins was recorded in 1837 and was originally associated with harvest time in general, not becoming specifically associated with Halloween until the mid-to-late 19th century.  

Image © Anthony22 via Wikipedia

Outdoor Halloween decorations.  

Image © Smallbones via Wikipedia and is in the public domain

A decorated house in Weatherly, Carbon County, Pennsylvania. 

Trick-Or-Treating And Guising 

You can read more about trick-or-treating here.

Trick-or-treating is a customary celebration for children on Halloween.  Children go in costume from house to house, usually receiving sweet treats or sometimes money, asking the question, “Trick or treat?”  The word trick implies a threat that they will perform mischief on the homeowners or their property if no treat is given.  The practice is said to have roots in the medieval practice of mumming, which is closely related to souling.  John Pymm wrote that “many of the feast days associated with the presentation of mumming plays were celebrated by the Christian Church.”  These feast days included All Hallows’ Eve, Christmas, Twelfth Night and Shrove Tuesday.  Mumming, as practised in Germany, Scandinavia and other parts of Europe, involved masked persons in fancy dress who paraded the streets and entered houses to dance or play dice in silence.

In England, from the medieval period up until the 1930’s, people practised the Christian custom of souling on Halloween, which involved groups of soulers, both Protestant and Catholic, going from parish to parish, begging the rich for soul cakes in exchange for praying for the souls of the givers and their friends.  In the Philippines, the practice of souling is called Pangangaluluwa and is practised on All Hallows’ Eve among children in rural areas.  People drape themselves in white cloths to represent souls and then visit houses, where they sing in return for prayers and sweets.

In Scotland and Ireland, guising is a traditional Halloween custom, where children disguised in costume go from door to door for food or coins.  It was recorded in Scotland at Halloween in 1895, when masqueraders in disguise, carrying lanterns made out of scooped-out turnips, visited homes to be rewarded with cakes, fruit, and money.  In Ireland, the most popular phrase for kids to shout (until the 2000’s) was “Help the Halloween Party”.  The practice of guising at Halloween in North America was first recorded in 1911, when a newspaper in Kingston, Ontario, Canada, reported children going guising around the neighbourhood.

American historian and author Ruth Edna Kelley of Massachusetts wrote the first book-length history of Halloween in the U.S.A., titled The Book of Hallowe’en (1919), and references souling in the chapter Hallowe’en in America.  In her book, Kelley touches on customs that arrived from across the Atlantic: “Americans have fostered them, and are making this an occasion something like what it must have been in its best days overseas.  All Halloween customs in the United States are borrowed directly or adapted from those of other countries”.

While the first reference to guising in North America occurs in 1911, another reference to ritual begging on Halloween appears, place unknown, in 1915, with a third reference in Chicago in 1920.  The earliest known use in print of the term trick or treat appears in 1927, in the Blackie Herald, of Alberta, Canada.

The thousands of Halloween postcards produced between the turn of the 20th century and the 1920’s commonly show children but not trick-or-treating.  Trick-or-treating did not seem to have become a widespread practice in North America until the 1930’s, with the first U.S.A. appearances of the term in 1934, and the first use in a national publication occurring in 1939.

A popular variant of trick-or-treating, known as trunk-or-treating (or Halloween tailgating), occurs when children are offered treats from the trunks (or boot as we say in the U.K.) of cars parked in a church parking lot, or sometimes, a school parking lot.  In a trunk-or-treat event, the boot of each car is decorated with a certain theme, such as those of children’s literature, movies, scripture, and job roles.  Trunk-or-treating has grown in popularity due to its perception as being safer than going door to door, a point that resonates well with parents, as well as the fact that it solves the rural conundrum in which homes are built a half-mile apart.  

Image © ToyahAnette B via Wikipedia and is in the public domain

Trick-or-treaters in Sweden. 

Image © unknown via Wikipedia and is in the public domain

A girl in a Halloween costume at Waterdown Public School, Waterdown, Ontario, Canada in 1928.

Waterdown is in the same province where the Scottish Halloween custom of guising was first recorded in North America.

Image © unknown via Wikipedia and is in the public domain

A Trunk-Or-Treat Event In Darien, Illinois, U.S.A.

This event is at the Lutheran Church and Early Learning Center in Darien.  This particular car has a jack-o’-lantern theme.   

Costumes  

Read more about Halloween costumes here.  You can see the Halloween costumes I have worn over the years here.

Halloween costumes were traditionally modelled after figures such as vampires, ghosts, skeletons, scary-looking witches, and devils.  Over time, the costume selection extended to include popular characters from fiction, celebrities, and generic archetypes such as ninjas and princesses.

Dressing up in costumes and going guising was prevalent in Scotland and Ireland at Halloween by the late 19th century.  The tradition is called guising, a Scottish term, because of the disguises or costumes worn by the children.  In Ireland and Scotland, the masks are known as false faces, a term recorded in Ayr, Scotland in 1890 by a Scot describing guisers.  He said, “I had mind it was Halloween.  The wee callans (boys) were at it already, rinning aboot wi’ their fause-faces (false faces) on and their bits o’ turnip lanthrons (lanterns) in their haun (hand)”.  Costuming became popular for Halloween parties in the U.S.A. in the early 20th century, as often for adults as for children, and when trick-or-treating was becoming popular in Canada and the U.S.A. in the 1920’s and 1930’s.

Eddie J. Smith, in his book Halloween, Hallowed is Thy Name, offers a religious perspective on the wearing of costumes on All Hallows’ Eve, suggesting that by dressing up as creatures who at one time caused us to fear and tremble, people are able to poke fun at Satan, whose kingdom has been plundered by Jesus.  Images of skeletons and the dead are traditional decorations used as memento mori.

New York’s annual Village Halloween Parade began in 1974 and is the world’s largest Halloween parade and America’s only major nighttime parade, attracting more than 60,000 costumed participants, two million spectators, and a worldwide television audience.

Image © Ardfern via Wikipedia

A Halloween shop in Waterloo Street, Derry, County Londonderry, Northern Ireland, selling masks in 2010.  

Image © Universal via Universal Studios and Trick Or Treat Studios
Image © Universal via Universal Studios and Trick Or Treat Studios
Image © Universal via Universal Studios and Trick Or Treat Studios

The EXCELLENT Frankenstein mask from Trick Or Treat Studios.

This is a very cool Universal Classic Monsters mask I purchased for Halloween 2023.  It is officially licensed by Universal Studios and made by Trick Or Treat Studios.  It is, to date, the favourite mask in my collection and what I have worn for Halloween parties.  To see me in this and many more masks click here.

Pet Costumes  

According to a 2018 report from the National Retail Federation, 30 million Americans were expected to spend an estimated $480 million on Halloween costumes for their pets in 2018, up from an estimated $200 million in 2010.  The most popular costumes for pets are the pumpkin, followed by the hot dog, and the bumblebee in third place.

Games And Other Activities 

There are several games traditionally associated with Halloween.  Some of these games originated as divination rituals or ways of foretelling one’s future, especially regarding death, marriage and children.  During the Middle Ages, these rituals were done by a rare few in rural communities as they were considered to be deadly serious practices.  In recent centuries, these divination games have been a common feature of the household festivities in Ireland and Britain.  They often involve apples and hazelnuts.  In Celtic mythology, apples were strongly associated with the Otherworld and immortality, while hazelnuts were associated with divine wisdom.  Some also suggest that they derive from Roman practices in celebration of Pomona.

The following activities were a common feature of Halloween in Ireland and Britain during the 17th – 20th centuries.  Some have become more widespread and continue to be popular today.  One common game is apple bobbing or dunking (which may be called dooking in Scotland) in which apples float in a tub or a large basin of water and the participants must use only their teeth to remove an apple from the basin.  A variant of dunking involves kneeling on a chair, holding a fork between the teeth and trying to drive the fork into an apple.  Another common game involves hanging up treacle or syrup-coated scones by strings.  These must be eaten without using hands while they remain attached to the string, an activity that inevitably leads to a sticky face.  Another once-popular game involves hanging a small wooden rod from the ceiling at head height, with a lit candle on one end and an apple hanging from the other.  The rod is spun round and everyone takes turns to try to catch the apple with their teeth.

Several of the traditional activities from Ireland and Britain involve foretelling one’s future partner or spouse.  An apple would be peeled in one long strip, then the peel tossed over the shoulder.  The peel is believed to land in the shape of the first letter of the future spouse’s name.  Two hazelnuts would be roasted near a fire, one named for the person roasting them and the other for the person they desire.  If the nuts jump away from the heat, it is a bad sign, but if the nuts roast quietly it foretells a good match.  A salty oatmeal bannock would be baked and the person would eat it in three bites and then go to bed in silence without anything to drink.  This is said to result in a dream in which their future spouse offers them a drink to quench their thirst.  Unmarried women were told that if they sat in a darkened room and gazed into a mirror on Halloween night, the face of their future husband would appear in the mirror.  The custom was widespread enough to be commemorated on greeting cards from the late 19th century and early 20th century.

Another popular Irish game was known as púicíní (blindfolds).  A person would be blindfolded and then choose between several saucers.  The item in the saucer would provide a hint as to their future.  A ring meant that they would marry soon, clay meant that they would die soon (perhaps within the year), water meant that they would emigrate, rosary beads meant that they would take Holy Orders (become a nun, priest, monk, etc.), a coin meant that they would become rich, and a bean meant that they would be poor.  The game features prominently in the James Joyce short story Clay (1914).

In Ireland and Scotland, items would be hidden in food (usually a cake, barmbrack, cranachan, champ or colcannon) and portions of it served out at random.  A person’s future would be foretold by the item they happened to find,  for example, a ring meant marriage and a coin meant wealth.

Up until the 19th century, the Halloween bonfires were also used for divination in parts of Scotland, Wales and Brittany. When the fire died down, a ring of stones would be laid in the ashes, one for each person.  In the morning, if any stone was mislaid it was said that the person it represented would not live out the year.

Telling ghost stories, listening to Halloween-themed songs and watching horror films are common fixtures of Halloween parties.  Episodes of television series and Halloween-themed specials (with the specials usually aimed at children) are commonly aired on or before Halloween, while new horror films are often released before Halloween to take advantage of the holiday.  

Image by unknown via Wikipedia and is in the public domain

A 1904 Halloween greeting card.

This early 20th-century card depicts a divination custom: a young woman looking into a mirror in a darkened room in hopes of catching a glimpse of her future husband.

Image by Charles F. Lester via Wikipedia and is in the public domain

Children bobbing for apples on Halloween.

The image above is from the book titled Hallowe’en at Merryvale, which was written by Alice Hale Burnett and illustrated by Charles F. Lester in 1916.  It comes from Project Gutenberg and can be found by clicking here.

Image by unknown via Wikipedia and is in the public domain

A Halloween gathering.

The image above is from the book titled The Book of Hallowe’en, which was written by Ruth Edna Kelley in 1919; the illustrator is unknown.  It comes from Project Gutenberg and can be found by clicking here.

Haunted Attractions  

You can read more about haunted attractions here.

Haunted attractions are entertainment venues designed to thrill and scare their customers.  Most attractions are seasonal Halloween businesses that may include haunted houses etc. and the level of sophistication of the effects has risen as the industry has grown.

The first recorded purpose-built haunted attraction was the Orton and Spooner Ghost House, which opened in 1915 in Liphook, England.  This steam-powered attraction most closely resembled a carnival fun house.  The house still exists as part of the Hollycombe Steam Collection.

It was during the 1930’s, about the same time as trick-or-treating, that Halloween-themed haunted houses first began to appear in America.  It was in the late 1950’s that haunted houses as a major attraction began to appear, focusing first on California.  Sponsored by the Children’s Health Home Junior Auxiliary, the San Mateo Haunted House opened in 1957.  The San Bernardino Assistance League Haunted House opened in 1958.  Home haunts began appearing across the country during 1962 and 1963.  In 1964, the San Mateo Haunted House opened, as well as the Children’s Museum Haunted House in Indianapolis.

The haunted house as an American cultural icon can be attributed to the opening of The Haunted Mansion in Disneyland on the 12th of August 1969.  Knott’s Berry Farm began hosting its own Halloween night attraction, Knott’s Scary Farm, which opened in 1973.  Evangelical Christians adopted a form of these attractions by opening one of the first hell houses in 1972.

The first Halloween haunted house run by a nonprofit organization was produced in 1970 by the Sycamore-Deer Park Jaycees in Clifton, Ohio.  It was co-sponsored by W.S.A.I. (an AM radio station broadcasting out of Cincinnati, Ohio).  It was last produced in 1982.  Other Jaycees followed suit with their own versions after the success of the Ohio house.  The March of Dimes copyrighted its Mini Haunted House in 1976 and began fundraising through its local chapters by conducting haunted houses soon after.  Although the organisation apparently quit supporting this type of event nationally sometime in the 1980’s, some March of Dimes haunted houses have persisted until today.

On the evening of May 11th, 1984, in Jackson Township, New Jersey, the Haunted Castle at Six Flags Great Adventure caught fire.  As a result of the fire, eight teenagers perished.  The backlash to the tragedy was a tightening of regulations relating to safety, building codes and the frequency of inspections of attractions nationwide.  The smaller venues, especially the nonprofit attractions, were unable to compete financially, and the better-funded commercial enterprises filled the vacuum.  Facilities that were once able to avoid regulation because they were considered to be temporary installations now had to adhere to the stricter codes required of permanent attractions.

In the late 1980’s and early 1990’s, theme parks entered the business seriously.  Six Flags Fright Fest began in 1986 and Universal Studios Florida began Halloween Horror Nights in 1991.  Knott’s Scary Farm experienced a surge in attendance in the 1990’s as a result of America’s obsession with Halloween as a cultural event.  Theme parks have played a major role in globalizing the holiday.  Universal Studios Singapore and Universal Studios Japan both participate, while Disney now mounts Mickey’s Not-So-Scary Halloween Party events at its parks in Paris, Hong Kong and Tokyo, as well as in the United States.  The theme park haunts are by far the largest, both in scale and attendance. 

Image © AgadaUrbanit via Wikipedia and is in the public domain

Humorous tombstones for Halloween.

These were in front of a house with a haunted house theme in Northern California, U.S.A. 

A humorous Halloween window display in Historic 25th Street, Ogden, Utah, U.S.A.

Food 

On All Hallows’ Eve, many Western Christian denominations encourage abstinence from meat, giving rise to a variety of vegetarian foods associated with this day.

Because in the Northern Hemisphere Halloween comes in the wake of the yearly apple harvest, toffee apples (known as candy apples or taffy apples in the U.S.A.) and caramel apples are Halloween treats made by rolling whole apples in a sticky sugar syrup, or caramel, sometimes followed by rolling them in nuts or other small savouries or confections and allowing them to cool.

One custom that persists in modern-day Ireland is the baking (or more often nowadays, the purchase) of a barmbrack (báirín breac), a light fruitcake into which a plain ring, a coin, and other charms are placed before baking.  It is considered fortunate to be the lucky one who finds one of the charms.  It has also been said that those who get the ring will find their true love in the ensuing year.  This is similar to the tradition of king cake at the festival of Epiphany.

Halloween-themed foods are also produced by companies in the lead-up to the night, for example, when Cadbury releases Goo Heads (similar to Creme Eggs) in spooky wrapping.

Here are some foods associated with Halloween around the world:

Barmbrack.

Bonfire toffee.

Candy apples.

Candy corn.

Candy pumpkins.

Caramel apples.

Caramel corn.

Chocolate.

Colcannon.

Halloween cake.

Monkey nuts (peanuts in their shells).

Novelty sweets/candy shaped like skulls, pumpkins, bats, worms, etc.

Pumpkin pie.

Roasted pumpkin seeds.

Roasted sweet corn.

Soul cakes.

Sweets/candy.

Toffee apples. 

Image © Raysonho via Wikipedia and is in the public domain

Pumpkins for sale during Halloween. 

Image © Evan-Amos via Wikipedia

A toffee apple with peanuts. 

Image © Joseolgon via Wikipedia

A jack-o’-lantern Halloween cake with a witch’s hat.

This cake was made in Braga, Portugal. 

See Also 

Campfire story.

Devil’s Night.

Dziady.

Ghost Festival.

Naraka Chaturdashi.

Kekri.

List of fiction works about Halloween.

List of films set around Halloween.

List of Halloween television specials.

Martinisingen.

Neewollah.

St. John’s Eve.

Walpurgis Night.

Will-o’-the-wisp.

English festivals.

The above articles and the rest of the images on this page were sourced from Wikipedia and are subject to change.

Read more about Halloween and notes etc. regarding the above post here

Blog Posts

Notes And Links

The image shown above of a carved pumpkin is the copyright of Wikipedia user Toby Ord.  It comes with a Creative Commons licence (CC BY-SA 2.5).  

The image above by John Masey Wright is via Wikipedia and is in the public domain.

The image above of a Bangladeshi girl lighting grave candles on the headstone of a deceased relative in the city of Chittagong for the observance of Allhallowtide via Wikipedia is copyright unknown and is in the public domain.

The image above of four young adult Lutheran Christians praying on the night of All Hallows’ Eve via Wikipedia is copyright unknown and is in the public domain.

The image shown above of a traditional Irish Halloween mask is the copyright of Wikipedia user Rannpháirtí anaithnid.  It comes with a Creative Commons licence (CC BY-SA 3.0).

The image above of Snap-Apple Night, painted by Irish artist Daniel Maclise in 1833, is via Wikipedia and is in the public domain.

The image shown above of a traditional Irish Jack-o’-lantern is the copyright of Wikipedia user Rannpháirtí anaithnid.  It comes with a Creative Commons licence (CC BY-SA 3.0).

The image shown above of the Greenwich Village Halloween parade is the copyright of Wikipedia user InSapphoWeTrust (Scarlet Sappho).  It comes with a Creative Commons licence (CC BY-SA 3.0).  You can find more great work from her by clicking here.

The image shown above of outdoor Halloween decorations is the copyright of Wikipedia user Anthony22.  It comes with a Creative Commons licence (CC BY-SA 3.0)

The image above of a decorated house in Weatherly, Carbon County, Pennsylvania, U.S.A. is the copyright of Wikipedia user Smallbones and is in the public domain. You can find more of the user’s great work by clicking here.

The image above of trick-or-treaters in Sweden is the copyright of Wikipedia user ToyahAnetteB and is in the public domain.

The image above of a girl in a Halloween costume at Waterdown Public School, Waterdown, Ontario, Canada in 1928 via Wikipedia is copyright unknown and is in the public domain.

The image above of a Trunk-Or-Treat Event In Darien, Illinois, U.S.A. via Wikipedia is copyright unknown and is in the public domain.

The image shown above of a Halloween shop in Derry, Northern Ireland, selling masks is the copyright of Wikipedia user Ardfern.  It comes with a Creative Commons licence (CC BY-SA 3.0)  

The image above of a 1904 Halloween greeting card is by unknown via Wikipedia and is in the public domain.

The image above of a Halloween gathering is by unknown via Wikipedia and is in the public domain.

The image shown above of humorous tombstones for Halloween is the copyright of Wikipedia user AgadaUrbanit and is in the public domain.

The image shown above of pumpkins for sale during Halloween is the copyright of Wikipedia user Raysonho and is in the public domain.

The image shown above of a toffee apple with peanuts is the copyright of Wikipedia user Evan-Amos.  It comes with a Creative Commons licence (CC BY-SA 3.0).  You can find more of the user’s great work by clicking here.

The image shown above of a jack-o’-lantern Halloween cake with a witch’s hat is the copyright of Wikipedia user Joseolgon.  It comes with a Creative Commons licence (CC BY-SA 4.0).

The image shown above of a Halloween display in Harborland, Kobe, Hyogo, Japan is the copyright of Wikipedia user 663highland.  It comes with a Creative Commons licence (CC BY-SA 2.5).  You can find more of the user’s great work by clicking here.

Creative Commons – Official website.  They work to advance universal access to knowledge and culture and to foster creativity, innovation, and collaboration.

Universal Pictures – U.K. official website.

Universal Pictures on YouTube.

Universal Pictures on Facebook.

Universal Pictures on Twitter.

Universal Studios – Official website.

Universal Studios on YouTube.

Universal Studios on Facebook.

Universal Studios on Twitter.

Trick Or Treat Studios – Official website.

Trick Or Treat Studios on YouTube.

Trick Or Treat Studios on Facebook.

Trick Or Treat Studios on Twitter.

Trick Or Treat Studios on Instagram.

Trick Or Treat Studios on TikTok.

Wikipedia – Official website.  Wikipedia is a free online encyclopedia that anyone can edit in good faith. Its purpose is to benefit readers by containing information on all branches of knowledge.  Hosted by the Wikimedia Foundation, it consists of freely editable content, whose articles also have numerous links to guide readers to more information.   

Bonfire Night

Image © Frank Parker

Bonfire Night always brings back happy memories from over the decades, especially family ones from when I was younger in the 70’s and 80’s.

It is a great tradition that brings people together to watch a bonfire and/or watch fireworks and/or (for many) have a party with food and drink.

I don’t like being too near a fire, as the flames have scared me and made me nervous since I was younger, but fire fascinates me too.  Watching the shapes in the flames and the different colours, and listening to its sounds, is mesmerising.

When it is just me and I have a bonfire at home, it is a chance to sit by it (weather permitting), have some baked potatoes and reminisce about the Bonfire Nights that have passed.

I think of the times I have made/helped make a Guy over the years.  They have been filled with loads of leaves out of the gardens, newspaper and old clothes. 

Once (in the 70’s) I glued a Guy Fawkes mask, cut out from my favourite comic, Whoopee!, onto an old cereal packet.  You can see the design below.

I used to like going out with my Sister Julie and Brother Bill to do Penny For The Guy. 

I remember having sparklers and writing my name in the dark night (although I wore gloves as they scared me, and to this day I am not a great fan of them and can’t hold one).

I remember my Dad keeping fireworks in a biscuit tin, chestnuts cooked in the bonfire ashes, and my Mom bringing sweets out in another tin and piping hot baked potatoes wrapped in foil in another tin ready to add loads of butter/margarine, yummy!

I think of when my Son Frank Jnr. and Daughter Debbie were younger and taking them to the bonfires at their Nan and Grandad’s and having bonfires with them at home (when it was possible).  I remember when they were older and had left home but came to visit and share the tradition with me.  Jnr. came with my Grandson Tyler and Deb came with my Granddaughter Kasey (when my grandkids were younger), and on those occasions Mom was there, all excited when the fireworks went off.

Speaking of fireworks, I remember one Bonfire Night at home when Dad picked up a jumping jack and, thinking it was dead, threw it in the bonfire.  It shot out and hit the wall behind us, just a few feet above and to the left of me and a friend.  Luck was on our side that day, ha ha.

All these are wonderful memories now Mom and Dad are no longer with us.

Although it will never be as magical as it was back in the day, it is a tradition that I will celebrate at home for as long as I can: having a bonfire, whatever its size (if there is anything to burn, that is), having baked potatoes and finishing Bonfire Night watching V For Vendetta.  Traditions mean a lot to me.

About Bonfire Night 

Bonfire Night, also known as Guy Fawkes Night, Guy Fawkes Day, and Fireworks Night, is an annual commemoration observed on the 5th of November, primarily in Great Britain, involving bonfires and fireworks displays.  Its history begins with the events of the 5th of November, 1605, when Guy Fawkes, a member of the Gunpowder Plot, was arrested while guarding explosives the plotters had placed beneath the House of Lords.  The Catholic plotters had intended to assassinate the Protestant King James I and his parliament.  Celebrating that the king had survived, people lit bonfires around London.  Months later, the Observance of 5th of November Act mandated an annual public day of thanksgiving for the plot’s failure.

Within a few decades Gunpowder Treason Day, as it was known, became the predominant English state commemoration.  As it carried strong Protestant religious overtones it also became a focus for anti-Catholic sentiment.  Puritans delivered sermons regarding the perceived dangers of popery, while during increasingly raucous celebrations common folk burnt effigies of popular hate figures, such as the Pope.  Towards the end of the 18th century reports appeared of children begging for money with effigies of Guy Fawkes, and the 5th of November gradually became known as Guy Fawkes Day.  Towns such as Lewes and Guildford were in the 19th century scenes of increasingly violent class-based confrontations, fostering traditions those towns celebrate still, albeit peaceably.  In the 1850’s changing attitudes resulted in the toning down of much of the day’s anti-Catholic rhetoric, and the Observance of the 5th November Act was repealed in 1859.  Eventually, the violence was dealt with, and by the 20th century, Guy Fawkes Day had become an enjoyable social commemoration, although lacking much of its original focus.  The present-day Bonfire Night is usually celebrated at large organised events.

Settlers exported Guy Fawkes Night to overseas colonies, including some in North America, where it was known as Pope Day.  Those festivities died out with the onset of the American Revolution.  Claims that Guy Fawkes Night was a Protestant replacement for older customs such as Samhain are disputed. 

Image by Paul Sandby and is in the public domain

Festivities in Windsor Castle during Guy Fawkes night in 1776.

This is by artist Paul Sandby and is one of a group of four prints of Windsor Castle. 

The Origins And History Of Bonfire Night 

Guy Fawkes Night originates from the Gunpowder Plot of 1605, a failed attempt by a group of provincial English Catholics to assassinate the Protestant King James I of England and VI of Scotland and replace him with a Catholic head of state.  In the immediate aftermath of the November 5th arrest of Guy Fawkes, caught guarding a cache of explosives placed beneath the House of Lords, James’s Council allowed the public to celebrate the king’s survival with bonfires, so long as they were without any danger or disorder.  This made 1605 the first year the plot’s failure was celebrated.

The following January, days before the surviving conspirators were executed, Parliament, at the initiation of James I, passed the Observance of 5th November Act, commonly known as the Thanksgiving Act.  It was proposed by a Puritan Member of Parliament, Edward Montagu, who suggested that the king’s apparent deliverance by divine intervention deserved some measure of official recognition, and kept the 5th of November free as a day of thanksgiving while in theory making attendance at Church mandatory.  A new form of service was also added to the Church of England’s Book of Common Prayer, for use on that date.  Little is known about the earliest celebrations.  In settlements such as Carlisle, Norwich, and Nottingham, corporations (town governments) provided music and artillery salutes. Canterbury celebrated the 5th of November, 1607 with 106 pounds (48 kg) of gunpowder and 14 pounds (6.4 kg) of match, and three years later food and drink were provided for local dignitaries, as well as music, explosions, and a parade by the local militia.  Even less is known of how the occasion was first commemorated by the general public, although records indicate that in the Protestant stronghold of Dorchester a sermon was read, the church bells rung, and bonfires and fireworks lit.  

Image © William Warby via Wikipedia

A Guy Fawkes wax model being burned on a bonfire. 

This was at the Billericay Fireworks Spectacular in Lake Meadows Park, Billericay, Essex, England.

Image © Frank Parker

Guy Fawkes on Bonfire Night, 2016.

This is a Guy Fawkes I made for a bonfire I had when I was living in my house in Kitts Green, Birmingham, England.  It isn’t as spectacular as the one above and it could have been better, but it was a last-minute project made in around two hours.  He was held together by duct tape, sellotape and safety pins but he looked cool in his cardboard V for Vendetta mask and his Wii remote lightsaber (he was a modern-day Guy who loves Sci-Fi) ha ha. 

Image © Brian Walker via Whoopee! and Great News For All Readers!

A Guy Fawkes mask from Whoopee! dated 28/10/1978.

There have been a few masks printed of Guy Fawkes in the comic Whoopee!, my favourite in the 1970’s and 1980’s,  but this one is the one that I used for a family-made Guy in the 70’s. 

Read about Whoopee! and lots of great old comics from my childhood here.  

Early Significance   

According to historian and author Antonia Fraser, a study of the earliest sermons preached demonstrates an anti-Catholic concentration mystical in its fervour.  Delivering one of five 5th of November sermons printed in A Mappe of Rome in 1612, Thomas Taylor said that Fawkes’s cruelty had been almost without bounds.  Such messages were also spread in printed works such as Francis Herring’s Pietas Pontifica (republished in 1610 as Popish Piety), and John Rhodes’s A Brief Summe of the Treason intended against the King & State.  By the 1620’s the Fifth was honoured in market towns and villages across the country, though it was some years before it was commemorated throughout England.  Gunpowder Treason Day, as it was then known, became the predominant English state commemoration.  Some parishes made the day a festive occasion, with public drinking and solemn processions.  Concerned, though, about James’s pro-Spanish foreign policy, the decline of international Protestantism, and Catholicism in general, Protestant clergymen who recognised the day’s significance called for more dignified and profound thanksgivings each November the 5th.

What unity English Protestants had shared in the plot’s immediate aftermath began to fade when in 1625 James’s son, the future Charles I, married the Catholic Henrietta Maria of France.  Puritans reacted to the marriage by issuing a new prayer to warn against rebellion and Catholicism, and on the 5th of November that year, effigies of the pope and the devil were burnt, the earliest such report of this practice and the beginning of centuries of tradition.  During Charles’s reign, Gunpowder Treason Day became increasingly partisan.  Between 1629 and 1640 he ruled without Parliament, and he seemed to support Arminianism, regarded by Puritans such as Henry Burton as a step toward Catholicism.  By 1636, under the leadership of the Arminian Archbishop of Canterbury William Laud, the English church was trying to use November the 5th to denounce all seditious practices, and not just popery.  Puritans went on the defensive, some pressing for further reformation of the Church.

Bonfire Night assumed a new fervour during the events leading up to the English Interregnum.  Although Royalists disputed their interpretations, Parliamentarians began to uncover or fear new Catholic plots.  Preaching before the House of Commons on the 5th of November 1644, Charles Herle claimed that Papists were tunnelling “from Oxford, Rome, Hell, to Westminster, and there to blow up, if possible, the better foundations of your houses, their liberties and privileges”.  

Following Charles I’s execution in 1649, the country’s new republican regime remained undecided on how to treat November the 5th.  Unlike the old system of religious feasts and State anniversaries, it survived, but as a celebration of parliamentary government and Protestantism, and not of monarchy.  Commonly the day was still marked by bonfires and miniature explosives, but formal celebrations resumed only with the Restoration, when Charles II became king.  Courtiers, High Anglicans and Tories followed the official line.   Generally, the celebrations became more diverse.  By 1670 London apprentices had turned the 5th of November into a fire festival, attacking not only popery but also sobriety and good order, demanding money from coach occupants for alcohol and bonfires.  The burning of effigies, largely unknown to the Jacobeans, continued in 1673 when Charles’s brother, the Duke of York, converted to Catholicism.  In response, accompanied by a procession of about 1,000 people, the apprentices fired an effigy of the Whore of Babylon, bedecked with a range of papal symbols.  Similar scenes occurred over the following few years.  On the 17th of November 1677, anti-Catholic fervour saw the Accession Day marked by the burning of a large effigy of the pope (his belly was filled with live cats) and two effigies of devils whispering in his ear.  Two years later, as the exclusion crisis reached its zenith, an observer noted that the 5th at night, being gunpowder treason, there were as many bonfires and burning of popes as had ever been seen.  Violent scenes in 1682 forced London’s militia into action, and to prevent any repetition the following year a proclamation was issued, banning bonfires and fireworks.

Fireworks were also banned under James II (previously the Duke of York), who became king in 1685.  Attempts by the government to tone down Gunpowder Treason Day celebrations were, however, largely unsuccessful, and some reacted to a ban on bonfires in London (born from a fear of more burnings of the pope’s effigy) by placing candles in their windows as a witness against Catholicism.  When James was deposed in 1688 by William of Orange – who, importantly, landed in England on November the 5th – the day’s events turned also to the celebration of freedom and religion, with elements of anti-Jacobitism.  While the earlier ban on bonfires was politically motivated, a ban on fireworks was maintained for safety reasons. 

Guy Fawkes Day  

William III’s birthday fell on the 4th of November, and for an orthodox Whig, the two days therefore became an important double anniversary.  William ordered that the Thanksgiving service for the 5th of November be amended to include thanks for his “happy arrival” and “the Deliverance of our Church and Nation”.  In the 1690’s he re-established Protestant rule in Ireland, and the Fifth, occasionally marked by the ringing of church bells and civic dinners, was consequently eclipsed by his birthday commemorations.  From the 19th century, November the 5th celebrations there became sectarian in nature.  Its celebration in Northern Ireland remains controversial, unlike in Scotland where bonfires continue to be lit in various cities.  In England, though, as one of 49 official holidays, the 5th of November became for the ruling class overshadowed by events such as the birthdays of Admiral Edward Vernon, or John Wilkes, and under George II and George III, with the exception of the Jacobite Rising of 1745, it was largely a polite entertainment rather than an occasion for vitriolic thanksgiving.  For the lower classes, however, the anniversary was a chance to pit disorder against order, a pretext for violence and uncontrolled revelry.  In 1790 the newspaper The Times reported instances of children begging for money for Guy Fawkes. 

Lower-class rioting continued, with reports in Lewes of annual rioting, intimidation of respectable householders and the rolling through the streets of lit tar barrels.  In Guildford, gangs of revellers who called themselves guys terrorised the local population.  Proceedings were concerned more with the settling of old arguments and general mayhem, than any historical reminiscences.  Similar problems arose in Exeter, originally the scene of more traditional celebrations.  In 1831 an effigy was burnt of the new Bishop of Exeter Henry Phillpotts, a High Church Anglican and High Tory who opposed Parliamentary reform, and who was also suspected of being involved in creeping popery.  A local ban on fireworks in 1843 was largely ignored, and attempts by the authorities to suppress the celebrations resulted in violent protests and several injured constables.

On several occasions during the 19th century, The Times also reported that the tradition was in decline.  Civil unrest brought about by the union of the Kingdoms of Great Britain and Ireland in 1800 resulted in Parliament passing the Roman Catholic Relief Act 1829, which afforded Catholics greater civil rights, continuing the process of Catholic Emancipation in the two kingdoms.  The traditional denunciations of Catholicism had been in decline since the early 18th century and were thought by many, including Queen Victoria, to be outdated, but the pope’s restoration in 1850 of the English Catholic hierarchy gave renewed significance to November the 5th, as demonstrated by the burnings of effigies of the new Catholic Archbishop of Westminster Nicholas Wiseman, and the pope.  At Farringdon Market 14 effigies were processed from the Strand and over Westminster Bridge to Southwark, while extensive demonstrations were held throughout the suburbs of London.  Effigies of the 12 new English Catholic bishops were paraded through Exeter, already the scene of severe public disorder on each anniversary of the Fifth.  Gradually, however, such scenes became less popular. With little resistance in Parliament, the thanksgiving prayer of November the 5th contained in the Anglican Book of Common Prayer was abolished, and in March 1859 the Anniversary Days Observance Act repealed the Observance of 5th November Act.

As the authorities dealt with the worst excesses, public decorum was gradually restored.  The sale of fireworks was restricted, and the Guildford guys were neutralised in 1865, although this was too late for one constable, who died of his wounds.  Violence continued in Exeter for some years, peaking in 1867 when, incensed by rising food prices and banned from firing their customary bonfire, a mob was twice in one night driven from Cathedral Close by armed infantry.  Further riots occurred in 1879, but there were no more bonfires in Cathedral Close after 1894.  Elsewhere, sporadic instances of public disorder persisted late into the 20th century, accompanied by large numbers of firework-related accidents, but a national Firework Code and improved public safety have in most cases brought an end to such things.

Image © Heather Buckley via Wikipedia

Lewes Bonfire Night in 2010.

Revellers in East Sussex, England. 

Image unknown via Wikipedia and is in the public domain

The Guy Fawkes of 1850.

This commentary on the restoration of the Catholic hierarchy in England in 1850 is from Punch magazine, November of that year.  The artist is unknown. 

Songs, Guys And Later Developments

One notable aspect of the Victorians’ commemoration of Guy Fawkes Night was its move away from the centres of communities to their margins.  Gathering wood for the bonfire increasingly became the province of working-class children, who solicited combustible materials, money, food and drink from wealthier neighbours, often with the aid of songs.  Most opened with the familiar “Remember, remember, the fifth of November, Gunpowder Treason and Plot”.  The earliest recorded rhyme, from 1742, is reproduced below alongside one bearing similarities to most Guy Fawkes Night ditties, recorded in 1903 at Charlton on Otmoor.

From 1742:

“Don’t you Remember,
The Fifth of November,
‘Twas Gunpowder Treason Day,
I let off my gun,
And made ’em all run.
And Stole all their Bonfire away.”

From 1903:

“The fifth of November, since I can remember,
Was Guy Faux, Poke him in the eye,
Shove him up the chimney pot, and there let him die.
A stick and a stake, for King George’s sake,
If you don’t give me one, I’ll take two,
The better for me, and the worse for you,
Ricket-a-racket your hedges shall go.” 

Organised entertainment also became popular in the late 19th century, and 20th-century pyrotechnic manufacturers renamed Guy Fawkes Day as Firework Night.  Sales of fireworks dwindled somewhat during the First World War but resumed in the following peace.  At the start of the Second World War, celebrations were again suspended, resuming in November 1945.  For many families, Bonfire Night became a domestic celebration, and children often congregated on street corners, accompanied by their own effigy of Guy Fawkes.  This was sometimes ornately dressed and sometimes a barely recognisable bundle of rags stuffed with whatever filling was suitable.  A survey found that in 1981 about 23 per cent of Sheffield schoolchildren made Guys, sometimes weeks before the event.  Collecting money was a popular reason for their creation, the children taking their effigy from door to door or displaying it on street corners.  But mainly, they were built to go on the bonfire, itself sometimes comprising wood stolen from other pyres, which helped bolster another November tradition, Mischief Night.  Rival gangs competed to see who could build the largest, sometimes even burning the wood collected by their opponents.  In 1954 the Yorkshire Post reported on fires late in September, a situation that forced the authorities to remove latent piles of wood for safety reasons.  Lately, however, the custom of a penny for the Guy has almost completely disappeared.  In contrast, some older customs still survive.  In Ottery St. Mary residents run through the streets carrying flaming tar barrels, and since 1679 Lewes has been the setting of some of England’s most extravagant November the 5th celebrations, the Lewes Bonfire.

Generally, modern  November the 5th celebrations are run by local charities and other organisations, with paid admission and controlled access.  In 1998 an editorial in the Catholic Herald called for the end of Bonfire Night, labelling it an offensive act.  Author Martin Kettle, writing in The Guardian in 2003, bemoaned an occasionally nannyish attitude to fireworks that discourages people from holding firework displays in their back gardens, and an unduly sensitive attitude toward the anti-Catholic sentiment once so prominent on Bonfire Night.  David Cressy summarised the modern celebration with these words, “The rockets go higher and burn with more colour, but they have less and less to do with memories of the Fifth of November … it might be observed that Guy Fawkes’ Day is finally declining, having lost its connection with politics and religion.  But we have heard that many times before.”

In 2012 Tom de Castella said, “It’s probably not a case of Bonfire Night decline, but rather a shift in priorities… there are new trends in the bonfire ritual.  Guy Fawkes masks have proved popular and some of the more quirky bonfire societies have replaced the Guy with effigies of celebrities in the news (including Lance Armstrong and Mario Balotelli) and even politicians.  The emphasis has moved.  The bonfire with a Guy on top (indeed the whole story of the Gunpowder Plot) has been marginalised.  But the spectacle remains.”

Image by Geoff Charles via Wikipedia and is in the public domain

Children from Bontnewydd collecting for the Guy.

This photo by Geoff Charles of children in Caernarfon, Wales was taken in November 1962.   The sign reads Penny for the Guy in Welsh.  

Image © Sam Roberts via Wikipedia

Spectators around a Bonfire at Himley Hall.

This photo was taken by Sam Roberts in Dudley, England. 

In Other Countries 

Gunpowder Treason Day was exported by settlers to colonies around the world, including members of the Commonwealth of Nations such as Australia, New Zealand, Canada, and various Caribbean nations.  In Australia, Sydney (founded as a British penal colony in 1788) saw at least one instance of the parading and burning of a Guy Fawkes effigy in 1805, while in 1833, four years after its founding, Perth listed Gunpowder Treason Day as a public holiday.  By the 1970’s, Bonfire Night had become less common in Australia, with the event simply an occasion to set off fireworks with little connection to Guy Fawkes.  Mostly they were set off annually on a night called cracker night, which would include the lighting of bonfires.  Some states had their fireworks night or cracker night at different times of the year, with some being let off on the 5th of November, but most often, they were let off on the Queen’s birthday.  After a range of injuries to children involving fireworks, fireworks nights and the sale of fireworks were banned in all states except the Australian Capital Territory by the early 1980’s, which saw the end of cracker night.

Some measure of celebration remains in New Zealand, Canada, and South Africa.  On the Cape Flats in Cape Town, South Africa, Guy Fawkes Day has become associated with youth hooliganism.  In Canada in the 21st century, celebrations of Bonfire Night on November the 5th are largely confined to the province of Newfoundland and Labrador.  The day is still marked in Saint Vincent and the Grenadines, and in Saint Kitts and Nevis, but a fireworks ban by Antigua and Barbuda during the 1990’s reduced its popularity in that country.

In North America, the commemoration was at first paid scant attention, but the arrest of two boys caught lighting bonfires on the 5th of November 1662 in Boston suggests, in historian James Sharpe’s view, that an underground tradition of commemorating the Fifth existed.  In parts of North America, it was known as Pope Night, celebrated mainly in colonial New England, but also as far south as Charleston.  In Boston, founded in 1630 by Puritan settlers, an early celebration was held in 1685, the same year that James II assumed the throne.  Fifty years later, again in Boston, a local minister wrote about a great number of people going to Dorchester where at night they made a Great Bonfire and plaid off many fireworks.  The day ended in tragedy when four young men coming home in a Canoe were all Drowned.  Ten years later the raucous celebrations were the cause of considerable annoyance to the upper classes and a special Riot Act was passed, to prevent riotous tumultuous and disorderly assemblies of more than three persons, all or any of them armed with Sticks, Clubs or any kind of weapons, or disguised with vizards, or painted or discoloured faces, or in any manner disguised, having any kind of imagery or pageantry, in any street, lane, or place in Boston.  With inadequate resources, however, Boston’s authorities were powerless to enforce the Act.  In the 1740’s gang violence became common, with groups of Boston residents battling for the honour of burning the pope’s effigy.  But by the mid-1760’s these riots had subsided, and as colonial America moved towards revolution, the class rivalries featured during Pope Day gave way to anti-British sentiment.  Author Alfred Young said Pope Day provided the scaffolding, symbolism, and leadership for resistance to the Stamp Act in 1764–65, forgoing previous gang rivalries in favour of a unified resistance to Britain.

The passage in 1774 of the Quebec Act, which guaranteed French Canadians free practice of Catholicism in the Province of Quebec, provoked complaints from some Americans that the British were introducing Popish principles and French law.  Such fears were bolstered by opposition from the Church in Europe to American independence, threatening a revival of Pope Day.  

The tradition continued in Salem as late as 1817, and was still observed in Portsmouth, New Hampshire, in 1892.  In the late 18th century, effigies of prominent figures such as two Prime Ministers of Great Britain, the Earl of Bute and Lord North, and the American traitor General Benedict Arnold, were also burnt.  In the 1880’s bonfires were still being lit in some New England coastal towns, although no longer to commemorate the failure of the Gunpowder Plot.  In the area around New York City, stacks of barrels were burnt on Election Day eve, which after 1845 was a Tuesday early in November. 

See Also 

You can see references and sources to the above articles here.  The above was sourced from a page on Wikipedia and is subject to change.  

Blog Posts

Links

The image above of Guy Fawkes on Bonfire Night, 2016 is copyright of Frank Parker.

The image above of the festivities in Windsor Castle during Guy Fawkes night in 1776 is copyright of Wikipedia user William Warby.  It comes with a Creative Commons licence (CC BY 2.0).  

A Guy Fawkes mask from Whoopee! dated 28/10/1978 is by artist Brian Walker.  It comes from the website Great News For All Readers!

The image above of Lewes Bonfire Night in 2010 is copyright of Wikipedia user Heather Buckley.  It comes with a Creative Commons licence (CC BY 2.0).  

The image above of the Guy Fawkes of 1850 is by artist unknown.  It is in the Public Domain.

The image above of Children from Bontnewydd collecting for the Guy is by Geoff Charles.  It is in the Public Domain.

The image above of spectators around a Bonfire at Himley Hall is by Sam Roberts.  It comes with a Creative Commons licence (CC BY 2.0).  

Great News For All Readers! – Official website.  This website is from a collector of comics published in Britain in the 1970’s and 1980’s.  It shows his memories of being a reader of these comics as a child, his observations as a collector today and an attempt to catalogue the comics from a fan’s perspective. 

Great News For All Readers! on Facebook.

Great News For All Readers! on Twitter.

Creative Commons – Official website.  They offer better sharing, advancing universal access to knowledge and culture, and fostering creativity, innovation, and collaboration.   

Holidays

Image © Pexels via Pexels

The main holidays I celebrated growing up through the decades, and still do, were New Year, Easter, Bonfire Night and Christmas.  Celebrating Halloween came much later in my adult years.  All these contain happy memories with my family, kids and grandkids.

These holidays carry their own traditions, and traditions meant a lot to my Mom just as they mean a lot to me, because as long as I carry on doing the things she did, and my own, they will never die out in a world where such things don’t seem to matter anymore to a lot of people.  The traditions that Mom loved, and the ones we did together, forever bring a smile to my face and happy memories.  As long as I can do them I will, and I will keep them alive, not just for me but for my grandkids and for Mom too, because I know she is here in spirit to enjoy them.    

About Holidays

A holiday is a day or other period set aside for festivals or recreation.  They appear at various times during the four seasons.  Public holidays are set by public authorities and vary by state or region.  Religious holidays are set by religious organisations for their members and are often also observed as public holidays in religious-majority countries.  Some religious holidays, such as Christmas, have become secularised by part or all of those who observe them.  In addition to secularisation, many holidays have become commercialised due to the growth of industry.

Holidays can be thematic, celebrating or commemorating particular groups, events, or ideas, or non-thematic, days of rest that do not have any particular meaning.  In Commonwealth English, the term can also refer to any period of rest from work (a.k.a. vacations) or to school holidays.  In the United States, “the holidays” typically refers to the period from Thanksgiving to New Year’s.  Thanksgiving is observed in the United States, Canada, Grenada, Saint Lucia and Liberia, unofficially in countries like Brazil, Germany and the Philippines, and also in the Dutch town of Leiden and the Australian territory of Norfolk Island. 

If there is a celebration of some sort you will usually see lots of colourful fireworks.  

Image © Pexels via Pexels

A great display of blue fireworks.

New Year

You can read about New Year here.

Easter

You can read about Easter here.

Halloween

You can read about Halloween here.

Bonfire Night

You can read about Bonfire Night here.

Christmas

You can read about Christmas here.

Terminology

The word holiday comes from the Old English word hāligdæg (hālig “holy” + dæg “day”).  The word originally referred only to special religious days.

The word holiday has differing connotations in different regions.  In the United Kingdom and other Commonwealth nations, the word may refer to a period where leave from one’s duties has been agreed upon.  This time is usually set aside for rest, travel, or participation in recreational activities, with entire industries targeted to coincide with or enhance these experiences.  The days of leave may not coincide with any specific customs or laws.  Employers and educational institutes may also designate holidays themselves, which may or may not overlap with nationally or culturally relevant dates.  In the United States, the word is used exclusively to refer to nationally, religiously, or culturally observed day(s) of rest or celebration, or the events themselves; a period of leave from work is instead known as a vacation.  More broadly, in North America it can mean any dedicated day or period of celebration.   

Global Holidays 

The celebration of the New Year has been a common holiday across cultures for at least four millennia.  Such holidays normally celebrate the last day of the year and the arrival of the next year in a calendar system.  In modern cultures using the Gregorian calendar, the New Year’s celebration spans New Year’s Eve on the 31st of December and New Year’s Day on the 1st of January.  However, other calendar systems also have New Year’s celebrations, such as Chinese New Year and Vietnamese Tet.  New Year’s Day is the most common public holiday, observed by all countries using the Gregorian calendar except Israel.

Christmas is a popular holiday globally due to the spread of Christianity.  The holiday is recognised as a public holiday in many countries in Europe, the Americas, Africa and Australasia and is celebrated by over 2 billion people.  Although a holiday with religious origins, Christmas is often celebrated by non-Christians as a secular holiday.  For example, 61% of British people celebrate Christmas in an entirely secular way.  Christmas has also become a tradition in some non-Christian countries.  For example, for many Japanese people, it has become customary to buy and eat fried chicken on Christmas.  

Public Holidays 

Read more about Public Holidays here.  

Substitute Holidays 

If a holiday coincides with another holiday or a weekend day a substitute holiday may be recognised in lieu.  In the United Kingdom, the government website states that “If a bank holiday is on a weekend, a substitute weekday becomes a bank holiday, normally the following Monday.”  The process of moving a holiday from a weekend day to the following Monday is known as Mondayisation in New Zealand.  
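As a quick illustration of that rule, here is a minimal sketch in Python.  The function name is mine, and it only handles the simple “move to the following Monday” case quoted above, ignoring edge cases such as two holidays falling on the same weekend:

```python
from datetime import date, timedelta

def substitute_holiday(holiday):
    """Return the observed date for a holiday: weekend dates are
    moved to the following Monday, weekday dates are unchanged."""
    if holiday.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        return holiday + timedelta(days=7 - holiday.weekday())
    return holiday

# Christmas Day 2021 fell on a Saturday, so the substitute
# bank holiday in the U.K. was Monday the 27th of December.
print(substitute_holiday(date(2021, 12, 25)))  # 2021-12-27
```

Mondayisation in New Zealand works along the same lines, though the exact statutory rules differ by country.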

Religious Holidays 

Many holidays are linked to faiths and religions (see etymology above).  Christian holidays are defined as part of the liturgical year, the chief ones being Easter and Christmas.  The Orthodox Christian and Western-Roman Catholic patronal feast day or name day is celebrated on each place’s patron saint’s day, according to the Calendar of Saints.  Jehovah’s Witnesses annually commemorate The Memorial of Jesus Christ’s Death but do not celebrate other holidays with any religious significance such as Easter, Christmas or New Year.  This holds especially true for those holidays that have combined and absorbed rituals, overtones or practices from non-Christian beliefs into the celebration, as well as those holidays that distract from or replace the worship of Jehovah.

In Islam, the largest holidays are Eid al-Fitr (immediately after Ramadan) and Eid al-Adha (at the end of the Hajj).  Ahmadi Muslims additionally celebrate Promised Messiah Day, Promised Reformer Day, and Khilafat Day, but contrary to popular belief, none of these are regarded as holidays.  Hindus, Jains and Sikhs observe several holidays, one of the largest being Diwali (Festival of Light).  Japanese holidays, as well as a few Catholic holidays, contain heavy references to several different faiths and beliefs.

Celtic, Norse, and Neopagan holidays follow the order of the Wheel of the Year.  For example, Christmas ideas like decorating trees and the colours green, red, and white are very similar to those of Yule, a lesser Sabbat of the Wheel of the Year in modern Wicca (a modern Pagan belief).  Some are closely linked to Swedish festivities.  The Bahaʼí Faith observes 11 annual holidays on dates determined using the Bahaʼí calendar.  Jews have two holiday seasons, the Spring Feasts of Pesach (Passover) and Shavuot (Weeks, called Pentecost in Greek) and the Fall Feasts of Rosh Hashanah (Head of the Year), Yom Kippur (Day of Atonement), Sukkot (Tabernacles), and Shemini Atzeret (Eighth Day of Assembly). 

See Also

You can see references and sources to the above articles here.  The above was sourced from a page on Wikipedia and is subject to change.  

Blog Posts

Links

Pexels – Official website.  The image shown at the top of this page is the copyright of Pexels.  You can find more free stock photos on there.

Television

Image © of Max Rahubovskiy via Pexels

Most of us have grown up watching a television screen of some sort.  For me, television was at its best in the 1970’s and 1980’s when it was proper family entertainment. 

I don’t watch much telly these days (and I certainly DO NOT watch the bullshit so-called news).  Like films, it has all become too woke for my liking.  What was once entertainment has become a form of brainwashing and lecturing, and I don’t watch it live anymore.  I don’t turn on my television much unless it is to watch a DVD via my DVD player, watch YouTube or Amazon Prime, or watch something decent that fits in with my likes via my Amazon Fire TV Stick 4K Max.  

I have plenty of favourite television programmes from over the decades, as a child and beyond, but watching them with family in my favourite decade, the 70’s, will always hold the most special memories for me. 

I like most TV genres, with my favourites being Horror and Science Fiction.  I have favourite actors and actresses the same as anyone else does and they will be shown on this page.  I am not going to list every telly programme I have watched in my lifetime, that would be IMPOSSIBLE to remember, but I will list programmes I have watched and enjoyed that I think are worth watching for someone else.  Of course, your opinions may differ from mine, that’s life.  

About Television

Television (TV), also referred to as telly, is a telecommunication medium for transmitting moving images and sound.  The term can refer to a TV set or the medium of TV transmission.  Television is a mass medium for advertising, entertainment, news, and sports.

Television became available in crude experimental forms in the late 1920’s, but only after several years of further development was the new technology marketed to consumers.  After World War II, an improved form of black-and-white TV broadcasting became popular in the United Kingdom (U.K.) and the United States (U.S.), and TV sets became commonplace in homes, businesses, and institutions.  During the 1950’s, telly was the primary medium for influencing public opinion.  In the mid-1960’s, colour broadcasting was introduced in the U.S. and most other developed countries.

The availability of various types of archival storage media such as Betamax and Video Home System (VHS) tapes, Laser Discs, high-capacity hard disk drives, Compact Discs (CD’s), Digital Versatile Discs (DVD’s), flash drives, high-definition (HD) DVD’s and Blu-ray Discs, and cloud digital video recorders has enabled viewers to watch pre-recorded material, such as movies, at home on their own time schedule.  For many reasons, especially the convenience of remote retrieval, the storage of television and video programming now also occurs on the cloud (such as the video-on-demand service by Netflix).  At the end of the first decade of the 2000’s, digital television transmissions greatly increased in popularity.  Another development was the move from standard-definition TV (SDTV) (576i, with 576 interlaced lines of resolution, and 480i) to high-definition TV (HDTV), which provides a resolution that is substantially higher.  HDTV may be transmitted in different formats (1080p, 1080i and 720p).  Since 2010, with the invention of smart television, Internet television has increased the availability of television programs and movies via the Internet through streaming video services such as Netflix, Amazon Prime Video, and Hulu.

In 2013, 79% of the world’s households owned a television set.  The replacement of earlier cathode-ray tube (CRT) screen displays with compact, energy-efficient, flat-panel alternative technologies such as liquid-crystal display (LCD), both fluorescent-backlit and light-emitting diode (LED), organic light-emitting diode (OLED) and plasma displays was a hardware revolution that began with computer monitors in the late 1990’s.  Most television sets sold in the 2000’s were flat-panel, mainly LED’s.  Major manufacturers announced the discontinuation of CRT, Digital Light Processing (DLP), plasma, and even fluorescent-backlit LCD TV’s by the mid-2010’s.  In the near future, LED’s are expected to be gradually replaced by OLED TV’s.  Also, by the mid-2010’s, major manufacturers had announced that they would increasingly produce smart TV’s.  Smart TV’s with integrated Internet and Web 2.0 functions became the dominant form of television by the late 2010’s.

Television signals were initially distributed only as terrestrial television using high-powered radio-frequency television transmitters to broadcast the signal to individual television receivers.  Alternatively, television signals are distributed by coaxial cable or optical fibre, satellite systems and, since the 2000’s via the Internet.  Until the early 2000’s, these were transmitted as analogue signals, but a transition to digital television was expected to be completed worldwide by the late 2010’s.  A standard television set consists of multiple internal electronic circuits, including a tuner for receiving and decoding broadcast signals.  A visual display device that lacks a tuner is correctly called a video monitor rather than a television.

Image © Wags05 via Wikipedia

Flat-screen televisions for sale at a consumer electronics store in 2008.

Etymology

The word television comes from the Ancient Greek τῆλε (tele) meaning far, and Latin visio meaning sight.  The first documented usage of the term dates back to 1900, when the Russian scientist Constantin Perskyi used it in a paper that he presented in French at the first International Congress of Electricity, which ran from the 18th to the 25th of August 1900 during the International World Fair in Paris.

The anglicised version of the term was first attested in 1907 when it was classed as a theoretical system to transmit moving images over telegraph or telephone wires.  It was formed in English or borrowed from the French word télévision.  In the 19th century and early 20th century, other proposals for the name of a then-hypothetical technology for sending pictures over distance were telephote (1880) and televista (1904).

The abbreviation TV is from 1948.  The use of the term to mean a television set dates from 1941.  The use of the term to mean television as a medium dates from 1927.

The term telly is more common in the United Kingdom (U.K.).  The slang term the tube or the boob tube derives from the bulky cathode-ray tube used on most TV’s until the advent of flat-screen tellies.  

The History Of Television

Mechanical Television

Read more about Mechanical Television here.

Facsimile transmission systems (FAX) for still photographs pioneered methods of mechanical scanning of images in the early 19th century.  Alexander Bain introduced the facsimile machine between 1843 and 1846.  Frederick Bakewell demonstrated a working laboratory version in 1851.  Willoughby Smith discovered the photoconductivity of the element selenium in 1873.  As a 23-year-old German university student, Paul Julius Gottlieb Nipkow proposed and patented the Nipkow disk in 1884 in Berlin.  This was a spinning disk with a spiral pattern of holes in it, so each hole scanned a line of the image.  Although he never built a working model of the system, variations of Nipkow’s spinning disk image rasteriser became exceedingly common.  Constantin Perskyi coined the word television (TV) in a paper read to the International Electricity Congress at the International World Fair in Paris on the 24th of August, 1900.  Perskyi’s paper reviewed the existing electromechanical technologies, mentioning the work of Nipkow and others.  However, it was not until 1907 that developments in amplification tube technology by Lee de Forest and Arthur Korn, among others, made the design practical.

The first demonstration of the live transmission of images was by Georges Rignoux and A. Fournier in Paris in 1909.  A matrix of 64 selenium cells, individually wired to a mechanical commutator, served as an electronic retina.  In the receiver, a type of Kerr cell modulated the light and a series of differently angled mirrors attached to the edge of a rotating disc scanned the modulated beam onto the display screen.  A separate circuit regulated synchronisation.  The 8×8 pixel resolution in this proof-of-concept demonstration was just sufficient to clearly transmit individual letters of the alphabet.  An updated image was transmitted several times each second.

In 1911, Boris Rosing and his student Vladimir Zworykin created a system that used a mechanical mirror-drum scanner to transmit, in Zworykin’s words, “very crude images” over wires to the Braun tube (cathode-ray tube) in the receiver.  Moving images were not possible because, in the scanner, the sensitivity was not sufficient and the selenium cell was very laggy.

In 1921, Edouard Belin sent the first image via radio waves with his belinograph.

By the 1920’s, when amplification made TV practical, Scottish inventor John Logie Baird employed the Nipkow disk in his prototype video systems.  On the 25th of March, 1925, Baird gave the first public demonstration of televised silhouette images in motion, at Selfridges department store in London.  Since human faces had inadequate contrast to show up in his primitive system, he televised a ventriloquist’s dummy named Stooky Bill, whose painted face had higher contrast, talking and moving.  By the 26th of January, 1926, he had demonstrated before members of the Royal Institution the transmission of an image of a face in motion by radio.  This is widely regarded as the world’s first true public TV demonstration, exhibiting light, shade and detail.  Baird’s system used the Nipkow disk for both scanning the image and displaying it.  A brightly illuminated subject was placed in front of a spinning Nipkow disk set with lenses which swept images across a static photocell.  The thallium sulphide (Thalofide) cell, developed by Theodore Case in the United States (U.S.), detected the light reflected from the subject and converted it into a proportional electrical signal.  This was transmitted by Amplitude Modulation (AM) radio waves to a receiver unit, where the video signal was applied to a neon light behind a second Nipkow disk rotating synchronised with the first.  The brightness of the neon lamp was varied in proportion to the brightness of each spot on the image.  As each hole in the disk passed by, one scan line of the image was reproduced.  Baird’s disk had 30 holes, producing an image with only 30 scan lines, just enough to recognise a human face.  In 1927, Baird transmitted a signal over 438 miles (705 km) of telephone line between London and Glasgow.  Baird’s original televisor now resides in the Science Museum, South Kensington.

In 1928, Baird’s company (Baird Television Development Company/Cinema Television) broadcast the first transatlantic TV signal, between London and New York, and the first shore-to-ship transmission.  In 1929, he became involved in the first experimental mechanical TV service in Germany.  In November of the same year, Baird and Bernard Natan of Pathe established France’s first television company, Television-Baird-Natan.  In 1931, he made the first outdoor remote broadcast, of The Derby.  In 1932, he demonstrated ultra-short-wave (USW) television.  Baird’s mechanical system reached a peak of 240 lines of resolution on the British Broadcasting Company’s (BBC) telecasts in 1936, though the mechanical system did not scan the televised scene directly.  Instead, a 17.5 mm film was shot, rapidly developed and then scanned while the film was still wet.

A U.S. inventor, Charles Francis Jenkins, also pioneered television.  He published an article on Motion Pictures by Wireless in 1913 and transmitted moving silhouette images for witnesses in December 1923.  On the 13th of June, 1925, he publicly demonstrated the synchronised transmission of silhouette pictures.  In 1925 Jenkins used the Nipkow disk and transmitted the silhouette image of a toy windmill in motion, over a distance of 5 miles (8 km), from a naval radio station in Maryland to his laboratory in Washington, D.C., using a lensed disk scanner with a 48-line resolution.  He was granted U.S. Patent No. 1,544,156 (Transmitting Pictures over Wireless) on the 30th of June, 1925, having filed the application on the 13th of March, 1922.

Herbert E. Ives and Frank Gray of Bell Telephone Laboratories gave a dramatic demonstration of mechanical television on the 7th of April, 1927.  Their reflected-light television system included both small and large viewing screens.  The small receiver had a 2-inch-wide by 2.5-inch-high screen (5 by 6 cm).  The large receiver had a screen 24 inches wide by 30 inches high (60 by 75 cm).  Both sets could reproduce reasonably accurate, monochromatic, moving images.  Along with the pictures, the sets received synchronised sound.  The system transmitted images over two paths.  The first was a copper wire link from Washington to New York City, then a radio link from Whippany, New Jersey.  Comparing the two transmission methods, viewers noted no difference in quality.  Subjects of the telecast included Secretary of Commerce Herbert Hoover.  A flying-spot scanner beam illuminated these subjects.  The scanner that produced the beam had a 50-aperture disk.  The disc revolved at a rate of 18 frames per second, capturing one frame about every 56 milliseconds (today’s systems typically transmit 30 or 60 frames per second, or one frame every 33.3 or 16.7 milliseconds respectively).  Telly historian Albert Abramson underscored the significance of the Bell Labs demonstration and said, “It was in fact the best demonstration of a mechanical television system ever made to this time. It would be several years before any other system could even begin to compare with it in picture quality.”
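The frame-interval figures quoted above are just the reciprocal of the frame rate; a quick check in Python (the helper name is only illustrative):

```python
def frame_interval_ms(fps):
    """Milliseconds between successive frames at a given frame rate."""
    return 1000 / fps

print(round(frame_interval_ms(18)))     # 56 ms — the 1927 Bell Labs disc
print(round(frame_interval_ms(30), 1))  # 33.3 ms
print(round(frame_interval_ms(60), 1))  # 16.7 ms
```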

In 1928, WRGB, then W2XB, was started as the world’s first TV station.  It was broadcast from the General Electric (GE) facility in Schenectady, N.Y.  It was popularly known as WGY Television.  Meanwhile, in the Soviet Union, Leon Theremin had been developing a mirror drum-based television, starting with 16 lines resolution in 1925, then 32 lines and eventually 64 using interlacing in 1926.  As part of his thesis, on the 7th of May, 1926, he electrically transmitted, and then projected, near-simultaneous moving images on a 5-square-foot (0.46 m2) screen.

By 1927 Theremin had achieved an image of 100 lines, a resolution that was not surpassed until May 1932 by Radio Corporation of America (RCA), with 120 lines.

On Christmas Day in 1926, Kenjiro Takayanagi demonstrated a television system with a 40-line resolution that employed a Nipkow disk scanner and cathode ray tubes (CRT) display at Hamamatsu Industrial High School in Japan.  This prototype is still on display at the Takayanagi Memorial Museum at Shizuoka University, Hamamatsu Campus.  His research in creating a production model was halted by the SCAP after World War II.

Because only a limited number of holes could be made in the disks, and disks beyond a certain diameter became impractical, image resolution on mechanical television broadcasts was relatively low, ranging from about 30 lines up to 120 or so.  Nevertheless, the image quality of 30-line transmissions steadily improved with technical advances, and by 1933 the United Kingdom (U.K.) broadcasts using the Baird system were remarkably clear.  A few systems ranging into the 200-line region also went on the air.  Two of these were the 180-line system that Compagnie des Compteurs installed in Paris in 1935, and the 180-line system that Peck Television Corp. started in 1935 at station VE9AK in Montreal.  The advancement of all-electronic television (including image dissectors and other camera tubes and CRT’s for the reproducer) marked the start of the end for mechanical systems as the dominant form of television.  Mechanical TV, despite its inferior image quality and generally smaller picture, would remain the primary television technology until the 1930’s.  The last mechanical telecasts ended in 1939 at stations run by a handful of public universities in the U.S.

Image © of Hzeller via Wikipedia
Image © of Orrin Dunlap, Jnr.

John Logie Baird in 1925 with his televisor equipment and dummies James (on the left) and Stooky Bill (on the right). 

The above image is on page 650 of Popular Radio magazine, Vol. 10, No. 7, dated November 1926. It was published by Popular Radio, Inc. in New York, U.S.A.  You can download a copy of this magazine via World Radio History by clicking here.

Electronic Television 

Read more about Electronic Television here.

In 1897, English physicist J. J. Thomson was able, in his three well-known experiments, to deflect cathode rays, a fundamental function of the modern cathode-ray tube. The earliest version of the cathode ray tube (CRT) was invented by the German physicist Ferdinand Braun in 1897 and is also known as the Braun tube.  It was a cold-cathode diode, a modification of the Crookes tube, with a phosphor-coated screen.  Braun was the first to conceive the use of a CRT as a display device.  The Braun tube became the foundation of 20th-century television.  In 1906 the Germans Max Dieckmann and Gustav Glage produced raster images for the first time in a CRT.  In 1907, Russian scientist Boris Rosing used a CRT in the receiving end of an experimental video signal to form a picture.  He managed to display simple geometric shapes on the screen.

In 1908, Alan Archibald Campbell-Swinton, fellow of the Royal Society, published a letter in the scientific journal Nature in which he described how distant electric vision could be achieved by using a cathode-ray tube, or Braun tube, as both a transmitting and receiving device.  He expanded on his vision in a speech given in London in 1911, reported in The Times and the Journal of the Rontgen Society.  In another letter to Nature, published in October 1926, Campbell-Swinton also announced the results of some not-very-successful experiments he had conducted with G. M. Minchin and J. C. M. Stanton.  They attempted to generate an electrical signal by projecting an image onto a selenium-coated metal plate that was simultaneously scanned by a cathode ray beam.  These experiments were conducted before March 1914, when Minchin died, but they were later repeated by two different teams in 1937, by H. Miller and J. W. Strange from Electric and Musical Industries Ltd. (EMI), and by H. Iams and A. Rose from Radio Corporation of America (RCA).  Both teams succeeded in transmitting very faint images with Campbell-Swinton’s original selenium-coated plate.  Although others had experimented with using a cathode-ray tube as a receiver, the concept of using one as a transmitter was novel.  The first cathode-ray tube to use a hot cathode was developed by John B. Johnson (who gave his name to the term Johnson noise) and Harry Weiner Weinhart of Western Electric and became a commercial product in 1922.

In 1926, Hungarian engineer Kalman Tihanyi designed a television (TV) system using fully electronic scanning and display elements and employing the principle of charge storage within the scanning (or camera) tube.  The problem of low sensitivity to light resulting in low electrical output from transmitting (or camera) tubes would be solved with the introduction of charge-storage technology by Kalman Tihanyi beginning in 1924.  His solution was a camera tube that accumulated and stored electrical charges (photoelectrons) within the tube throughout each scanning cycle.  The device was first described in a patent application he filed in Hungary in March 1926 for a television system he called Radioskop.  After further refinements included in a 1928 patent application, Tihanyi’s patent was declared void in Great Britain in 1930, so he applied for patents in the United States (U.S.).  Although his breakthrough would be incorporated into RCA’s iconoscope design in 1931, the U.S. patent for Tihanyi’s transmitting tube would not be granted until May 1939.  The patent for his receiving tube had been granted the previous October.  Both patents had been purchased by RCA prior to their approval.  Charge storage remains a basic principle in the design of imaging devices for television to the present day.  On Christmas Day, 1926, at Hamamatsu Industrial High School in Japan, Japanese inventor Kenjiro Takayanagi demonstrated a TV system with a 40-line resolution that employed a CRT display.  This was the first working example of a fully electronic television receiver and Takayanagi’s team later made improvements to this system parallel to other TV developments.  Takayanagi did not apply for a patent.

In the 1930’s, Allen B. DuMont made the first CRT to last 1,000 hours of use, which was one of the factors that led to the widespread adoption of TV.

On the 7th of September 1927, U.S. inventor Philo Farnsworth’s image dissector camera tube transmitted its first image, a simple straight line, at his laboratory at 202 Green Street in San Francisco.  By the 3rd of September 1928, Farnsworth had developed the system sufficiently to hold a demonstration for the press.  This is widely regarded as the first electronic television demonstration.  In 1929, the system was improved further by the elimination of a motor generator, so that his television system now had no mechanical parts.  That year, Farnsworth transmitted the first live human images with his system, including a three-and-a-half-inch image of his wife Elma (nicknamed Pem) with her eyes closed (possibly due to the bright lighting required).

Meanwhile, Vladimir Zworykin was also experimenting with the cathode-ray tube to create and show images.  While working for Westinghouse Electric in 1923, he began to develop an electronic camera tube.  But in a 1925 demonstration, the image was dim, had low contrast, and poor definition, and was stationary.  Zworykin’s imaging tube never got beyond the laboratory stage but RCA, which acquired the Westinghouse patent, asserted that the patent for Farnsworth’s 1927 image dissector was written so broadly that it would exclude any other electronic imaging device.  Thus RCA, on the basis of Zworykin’s 1923 patent application, filed a patent interference suit against Farnsworth. The U.S. Patent Office examiner disagreed in a 1935 decision, finding priority of invention for Farnsworth against Zworykin.  Farnsworth claimed that Zworykin’s 1923 system could not produce an electrical image of the type to challenge his patent.  Zworykin received a patent in 1928 for a colour transmission version of his 1923 patent application.  He also divided his original application in 1931.  Zworykin was unable or unwilling to introduce evidence of a working model of his tube that was based on his 1923 patent application. In September 1939, after losing an appeal in the courts, and being determined to go forward with the commercial manufacturing of television equipment, RCA agreed to pay Farnsworth US$1 million over a ten-year period, in addition to license payments, to use his patents.

In 1933, RCA introduced an improved camera tube that relied on Tihanyi’s charge storage principle.  Called the Iconoscope by Zworykin, the new tube had a light sensitivity of about 75,000 lux and thus was claimed to be much more sensitive than Farnsworth’s image dissector.  However, Farnsworth had overcome his power issues with his Image Dissector through the invention of a unique multipactor device that he began work on in 1930 and demonstrated in 1931.  This small tube could amplify a signal reportedly to the 60th power or better and showed great promise in all fields of electronics.  Unfortunately, an issue with the multipactor was that it wore out at an unsatisfactory rate.

At the Berlin Radio Show in August 1931, Manfred von Ardenne gave a public demonstration of a television system using a CRT for both transmission and reception, the first completely electronic television transmission.  However, Ardenne had not developed a camera tube, using the CRT instead as a flying-spot scanner to scan slides and film.  Ardenne achieved his first transmission of TV pictures on Christmas Eve, 1933, followed by test runs for a public television service in 1934.  The world’s first electronically scanned TV service started in Berlin in 1935, the Fernsehsender Paul Nipkow, culminating in the live broadcast of the 1936 Summer Olympic Games from Berlin to public places all over Germany.

Philo Farnsworth gave the world’s first public demonstration of an all-electronic TV system, using a live camera, at the Franklin Institute of Philadelphia on the 25th of August 1934, and for ten days afterwards.  Mexican inventor Guillermo González Camarena also played an important role in early telly.  His experiments with TV (known as telectroescopía at first) began in 1931 and led to a patent for the trichromatic field sequential system colour TV in 1940.  In Britain, the EMI engineering team led by Isaac Shoenberg applied in 1932 for a patent for a new device they called the Emitron, which formed the heart of the cameras they designed for the British Broadcasting Corporation (BBC).  On the 2nd of November 1936, a 405-line broadcasting service employing the Emitron began at studios in Alexandra Palace, and transmitted from a specially built mast atop one of the Victorian building’s towers.  It alternated for a short time with Baird’s mechanical system in adjoining studios but was more reliable and visibly superior.  This was the world’s first regular high-definition television service, as high definition was then defined.

The original U.S. iconoscope was noisy, had a high ratio of interference to signal, and ultimately gave disappointing results, especially when compared to the high-definition mechanical scanning systems then becoming available.  The Electric and Musical Industries Ltd. (EMI) team, under the supervision of Isaac Shoenberg, analysed how the iconoscope (or Emitron) produces an electronic signal and concluded that its real efficiency was only about 5% of the theoretical maximum.  They solved this problem by developing, and patenting in 1934, two new camera tubes dubbed super-Emitron and CPS Emitron.  The super-Emitron was between ten and fifteen times more sensitive than the original Emitron and iconoscope tubes and, in some cases, this ratio was considerably greater.  It was used for outside broadcasting by the British Broadcasting Corporation (BBC), for the first time, on Armistice Day 1937, when the general public could watch on a TV set as the King laid a wreath at the Cenotaph.  This was the first time that anyone had broadcast a live street scene from cameras installed on the roof of neighbouring buildings; neither Farnsworth nor RCA would do the same until the 1939 New York World’s Fair.

On the other hand, in 1934, Zworykin shared some patent rights with the German licensee company Telefunken.  The image iconoscope (Superikonoskop in Germany) was produced as a result of the collaboration.  This tube is essentially identical to the super-Emitron.  The production and commercialisation of the super-Emitron and image iconoscope in Europe were not affected by the patent war between Zworykin and Farnsworth, because Dieckmann and Hell had priority in Germany for the invention of the image dissector, having submitted a patent application for their Lichtelektrische Bildzerlegerröhre für Fernseher (Photoelectric Image Dissector Tube for Television) in Germany in 1925, two years before Farnsworth did the same in the United States.  The image iconoscope (Superikonoskop) became the industrial standard for public broadcasting in Europe from 1936 until 1960, when it was replaced by the vidicon and plumbicon tubes.  Indeed, it was the representative of the European tradition in electronic tubes competing against the American tradition represented by the image orthicon.  The German company Heimann produced the Superikonoskop for the 1936 Berlin Olympic Games; later, Heimann also produced and commercialised it from 1940 to 1955.  From 1952 to 1958, the Dutch company Philips produced and commercialised the image iconoscope and multicon.

U.S. television broadcasting, at the time, consisted of a variety of markets in a wide range of sizes, each competing for programming and dominance with separate technology, until deals were made and standards agreed upon in 1941.  RCA, for example, used only Iconoscopes in the New York area, but Farnsworth Image Dissectors in Philadelphia and San Francisco.  In September 1939, RCA agreed to pay the Farnsworth Television and Radio Corporation royalties over the next ten years for access to Farnsworth’s patents.  With this historic agreement in place, RCA integrated much of what was best about Farnsworth’s technology into their systems.  In 1941, the United States implemented 525-line television.  Electrical engineer Benjamin Adler played a prominent role in the development of television.

The world’s first 625-line TV standard was designed in the Soviet Union in 1944 and became a national standard in 1946.  The first broadcast in 625-line standard occurred in Moscow in 1948.  The concept of 625 lines per frame was subsequently implemented in the European CCIR standard.  In 1936, Kalman Tihanyi described the principle of plasma display, the first flat panel display system.

Early electronic TV sets were large and bulky, with analogue circuits made of vacuum tubes.  Following the invention of the first working transistor at Bell Labs, Sony founder Masaru Ibuka predicted in 1952 that the transition to electronic circuits made of transistors would lead to smaller and more portable TV sets.  The first fully transistorised, portable solid-state television set was the 8-inch Sony TV8-301, developed in 1959 and released in 1960.  This began the transformation of TV viewership from a communal viewing experience to a solitary one.  Sony went on to sell millions of portable TV sets worldwide.

Image © unknown via Wikipedia and is in the public domain

Ferdinand Braun.

Image © unknown via Wikipedia and is in the public domain

Vladimir Zworykin in 1929.

The Westinghouse Electric and Manufacturing Company research engineer can be seen here with Mildred Birt demonstrating electronic television.

The broadcast images are projected on a mirror on the top of the cabinet making it possible for many to watch.

Image © unknown via Wikipedia

Manfred von Ardenne in 1933. 

Image © unknown via Wikipedia and is in the public domain

A Radio Corporation Of America Advertisement.

This RCA advertisement from the Radio & Television magazine (Vol. X, No. 2, June, 1939) is for the beginning of regular experimental television broadcasting from the NBC studios to the New York metropolitan area, U.S.A.

Image © unknown via Wikipedia and is in the public domain

An Indian-head test pattern.

This 2F21 monoscope tube motif was used from 1940 until the advent of colour television.  It was displayed when a television station first signed on every day.

Colour Television 

Read more about Colour Television here

The basic idea of using three monochrome images to produce a colour image had been experimented with almost as soon as black-and-white televisions (TV) had first been built.  Although he gave no practical details, among the earliest published proposals for TV was one by Maurice Le Blanc, in 1880, for a colour system, including the first mentions in TV literature of line and frame scanning.  Polish inventor Jan Szczepanik patented a colour TV system in 1897, using a selenium photoelectric cell at the transmitter and an electromagnet controlling an oscillating mirror and a moving prism at the receiver.  But his system contained no means of analysing the spectrum of colours at the transmitting end, and could not have worked as he described it.  Another inventor, Hovannes Adamian, also experimented with colour television as early as 1907.  He claimed the first colour TV project, which was patented in Germany on the 31st of March, 1908 (patent No. 197183), then in Britain on the 1st of April, 1908 (patent No. 7219), in France (patent No. 390326) and in Russia in 1910 (patent No. 17912).

Scottish inventor John Logie Baird demonstrated the world’s first colour transmission on the 3rd of July, 1928, using scanning discs at the transmitting and receiving ends with three spirals of apertures, each spiral with filters of a different primary colour and three light sources at the receiving end, with a commutator to alternate their illumination.  Baird also made the world’s first colour broadcast on the 4th of February, 1938, sending a mechanically scanned 120-line image from Baird’s Crystal Palace studios to a projection screen at London’s Dominion Theatre.  Mechanically scanned colour television was also demonstrated by Bell Laboratories in June 1929 using three complete systems of photoelectric cells, amplifiers, glow-tubes, and colour filters, with a series of mirrors to superimpose the red, green, and blue images into one full-colour image.

The first practical hybrid system was again pioneered by John Logie Baird.  In 1940 he publicly demonstrated a colour TV combining a traditional black-and-white display with a rotating coloured disk.  This device was very deep, but was later improved with a mirror folding the light path into an entirely practical device resembling a large conventional console.  However, Baird was unhappy with the design, and, as early as 1944, had commented to a British government committee that a fully electronic device would be better.

In 1939, Hungarian engineer Peter Carl Goldmark introduced an electro-mechanical system while at CBS Broadcasting Inc. (CBS), which contained an Iconoscope sensor.  The CBS field-sequential colour system was partly mechanical, with a disc made of red, blue, and green filters spinning inside the television camera at 1,200 rpm, and a similar disc spinning in synchronisation in front of the cathode ray tube (CRT) inside the receiver set.  The system was first demonstrated to the Federal Communications Commission (FCC) on the 29th of August, 1940, and shown to the press on the 4th of September, 1940. 

CBS began experimental colour field tests using film as early as the 28th of August, 1940, and live cameras by the 12th of November, 1940.  The National Broadcasting Company (NBC), which was owned by the Radio Corporation of America (RCA), made its first field test of colour TV on the 20th of February, 1941.  CBS began daily colour field tests on the 1st of June, 1941.  These colour systems were not compatible with existing black-and-white television sets, and, as no colour TV sets were available to the public at this time, viewing of the colour field tests was restricted to RCA and CBS engineers and the invited press.  The War Production Board halted the manufacture of TV and radio equipment for civilian use from the 22nd of April, 1942 to the 20th of August, 1945, limiting any opportunity to introduce colour TV to the general public.

As early as 1940, Baird had started work on a fully electronic system he called Telechrome.  Early Telechrome devices used two electron guns aimed at either side of a phosphor plate.  The phosphor was patterned so the electrons from the guns only fell on one side of the patterning or the other.  Using cyan and magenta phosphors, a reasonable limited-colour image could be obtained.  He also demonstrated the same system using monochrome signals to produce a 3D image (called stereoscopic at the time).  A demonstration on the 16th of August, 1944 was the first example of a practical colour TV system.  Work on the Telechrome continued and plans were made to introduce a three-gun version for full colour.  However, Baird’s untimely death in 1946 ended the development of the Telechrome system.  Similar concepts were common through the 1940’s and 1950’s, differing primarily in the way they re-combined the colours generated by the three guns.  The Geer tube was similar to Baird’s concept but used small pyramids with the phosphors deposited on their outside faces, instead of Baird’s 3D patterning on a flat surface.  The Penetron used three layers of phosphor on top of each other and increased the power of the beam to reach the upper layers when drawing those colours.  The Chromatron used a set of focusing wires to select the coloured phosphors arranged in vertical stripes on the tube.

One of the great technical challenges of introducing colour broadcast TV was the desire to conserve bandwidth, potentially three times that of the existing black-and-white standards, and not use an excessive amount of radio spectrum.  In the United States (U.S.), after considerable research, the National Television Systems Committee (NTSC) approved an all-electronic system developed by RCA, which encoded the colour information separately from the brightness information and greatly reduced the resolution of the colour information to conserve bandwidth.  As black-and-white TV’s could receive the same transmission and display it in black-and-white, the colour system adopted was backwards compatible.  Compatible Colour, featured in RCA advertisements of the period, is mentioned in the song ‘America’ from West Side Story (1957).  The brightness image remained compatible with existing black-and-white TV sets at slightly reduced resolution, while colour TV’s could decode the extra information in the signal and produce a limited-resolution colour display.  The higher-resolution black-and-white and lower-resolution colour images combine in the brain to produce a seemingly high-resolution colour image.  The NTSC standard represented a major technical achievement.
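The compatibility trick above can be sketched in a few lines of Python.  NTSC converts RGB into one luminance signal (Y) plus two chrominance signals (I and Q) using the standard YIQ weights; this is a minimal illustration of the principle, not broadcast-grade code:

```python
def rgb_to_yiq(r, g, b):
    """Convert an RGB pixel (values 0..1) to NTSC YIQ.

    Y is the brightness signal a black-and-white set displays; I and Q
    carry the colour differences, which NTSC transmits at reduced
    bandwidth compared with Y.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

# For any grey pixel the chrominance vanishes, so a monochrome set
# reading only Y reproduces the picture exactly.
y, i, q = rgb_to_yiq(0.5, 0.5, 0.5)
assert abs(i) < 1e-9 and abs(q) < 1e-9
```

A colour set rebuilds R, G and B from all three signals; an old black-and-white set simply displays Y and never notices the extra information.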

The first network colour series was the live program The Marriage, which premiered on the 8th of July, 1954.  During the following ten years most network broadcasts, and nearly all local programming, continued to be in black-and-white.  It was not until the mid-1960’s that colour sets started selling in large numbers, due in part to the colour transition of 1965, in which it was announced that over half of all network prime-time programming would be broadcast in colour that autumn.  The first all-colour prime-time season came just one year later.  In 1972, the last holdout among daytime network programs converted to colour, resulting in the first completely all-colour network season.

Early colour sets were either floor-standing console models or tabletop versions nearly as bulky and heavy, so in practice, they remained firmly anchored in one place.  General Electric’s (GE) relatively compact and lightweight Porta-Colour set was introduced in the spring of 1966.  It used a transistor-based ultrahigh-frequency (UHF) tuner.  The first fully transistorised colour television in the United States was the Quasar TV introduced in 1967.   These developments made watching colour television a more flexible and convenient proposition.

In 1972, sales of colour sets finally surpassed sales of black-and-white sets.  Colour broadcasting in Europe was not standardised on the Phase Alternating Line (PAL) format until the 1960’s, and broadcasts did not start until 1967.  By this point, many of the technical issues in the early sets had been worked out, and the spread of colour sets in Europe was fairly rapid.  By the mid-1970’s, the only stations broadcasting in black-and-white were a few high-numbered UHF stations in small markets and a handful of low-power repeater stations in even smaller markets such as vacation spots.  By 1979, even the last of these had converted to colour and, by the early 1980’s, black-and-white sets had been pushed into niche markets, notably low-power uses, small portable sets, or for use as video monitor screens in lower-cost consumer equipment.  By the late 1980’s, even these areas had switched to colour sets.

 

Image © Kskhh via Wikipedia

A 40″ Samsung Full HD LED TV.

Image © Denelson83 via Wikipedia and is in the public domain

SMPTE colour bars.

These are used in a test pattern, sometimes when no programme material is available.

Digital Television 

Read more about Digital Television here and here.

Digital television (DTV)  is the transmission of audio and video by digitally processed and multiplexed signals, in contrast to the totally analogue and channel-separated signals used by analogue television (TV).  Due to data compression, digital TV can support more than one programme in the same channel bandwidth.  It is an innovative service that represents the most significant evolution in TV broadcast technology since colour TV emerged in the 1950’s.  Digital TV’s roots have been tied very closely to the availability of inexpensive, high-performance computers.  It was not until the 1990’s that digital TV became possible.  Digital TV was previously not practically possible due to the impractically high bandwidth requirements of uncompressed digital video, requiring around 200 Mbit/s for a standard-definition television (SDTV) signal, and over 1 Gbit/s for high-definition television (HDTV).
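Those bandwidth figures are easy to sanity-check with a back-of-the-envelope calculation.  The sketch below assumes roughly 20 bits per pixel (10-bit 4:2:2 sampling) and ignores blanking intervals and audio, so the numbers are illustrative rather than exact:

```python
def uncompressed_bitrate(width, height, fps, bits_per_pixel):
    """Raw bit rate in bit/s for uncompressed digital video."""
    return width * height * fps * bits_per_pixel

# 625-line SD (720x576 active pixels at 25 frames/s) versus
# 1080-line HD at 30 frames/s, both at ~20 bits per pixel.
sdtv = uncompressed_bitrate(720, 576, 25, 20)
hdtv = uncompressed_bitrate(1920, 1080, 30, 20)
print(f"SDTV ~{sdtv / 1e6:.0f} Mbit/s, HDTV ~{hdtv / 1e9:.2f} Gbit/s")
# → SDTV ~207 Mbit/s, HDTV ~1.24 Gbit/s
```

Around 200 Mbit/s for SD and over 1 Gbit/s for HD, which is exactly why digital TV had to wait for DCT compression before it became practical.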

A digital TV service was proposed in 1986 by Nippon Telegraph and Telephone (NTT) and the Ministry of Posts and Telecommunication (MPT) in Japan, where there were plans to develop an Integrated Network System service.  However, it was not possible to practically implement such a digital TV service until the adoption of Discrete Cosine Transform (DCT) video compression technology made it possible in the early 1990’s.

In the mid-1980’s, as Japanese consumer electronics firms forged ahead with the development of HDTV technology, the MUSE analogue format proposed by the Japan Broadcasting Corporation (NHK) was seen as a pacesetter that threatened to eclipse United States (U.S.) electronics companies’ technologies.  Until June 1990, the Japanese MUSE standard, based on an analogue system, was the front-runner among the more than 23 other technical concepts under consideration.  Then, a U.S. company, General Instrument, demonstrated the possibility of a digital TV signal.  This breakthrough was of such significance that the Federal Communications Commission (FCC) was persuaded to delay its decision on an advanced television (ATV) standard until a digitally-based standard could be developed.

In March 1990, when it became clear that a digital standard was possible, the FCC made a number of critical decisions.  First, the Commission declared that the new ATV standard must be more than an enhanced analogue signal: it had to provide a genuine HDTV signal with at least twice the resolution of existing TV images.  Then, to ensure that viewers who did not wish to buy a new digital TV set could continue to receive conventional TV broadcasts, it dictated that the new ATV standard must be capable of being simulcast on different channels.  The new ATV standard also allowed the new digital television (DTV) signal to be based on entirely new design principles.  Although incompatible with the existing National Television Systems Committee (NTSC) standard, the new DTV standard would be able to incorporate many improvements.

The final standards adopted by the FCC did not require a single standard for scanning formats, aspect ratios, or lines of resolution.  This compromise resulted from a dispute between the consumer electronics industry (joined by some broadcasters) and the computer industry (joined by the film industry and some public interest groups) over which of the two scanning processes (interlaced or progressive) would be best suited for the newer digital HDTV-compatible display devices.  Interlaced scanning, which had been specifically designed for older analogue cathode ray tube (CRT) display technologies, scans even-numbered lines first, then odd-numbered ones.  In fact, interlaced scanning can be looked at as the first video compression model, as it was partly designed in the 1940’s to double the image resolution to exceed the limitations of the TV broadcast bandwidth.  Another reason for its adoption was to limit the flickering on early CRT screens, whose phosphor-coated screens could only retain the image from the electron scanning gun for a relatively short duration.  However, interlaced scanning does not work as efficiently on newer devices such as liquid-crystal displays (LCD), for example, which are better suited to a more frequent progressive refresh rate.

Progressive scanning, the format that the computer industry had long adopted for computer display monitors, scans every line in sequence, from top to bottom.  Progressive scanning in effect doubles the amount of data generated for every full screen displayed in comparison to interlaced scanning by painting the screen in one pass in 1/60-second, instead of two passes in 1/30-second.  The computer industry argued that progressive scanning is superior because it does not flicker on the new standard of display devices in the manner of interlaced scanning.  It also argued that progressive scanning enables easier connections with the Internet, and is more cheaply converted to interlaced formats than vice versa.  The film industry also supported progressive scanning because it offered a more efficient means of converting filmed programming into digital formats.  For their part, the consumer electronics industry and broadcasters argued that interlaced scanning was the only technology that could transmit the highest quality pictures then (and currently) feasible, i.e., 1,080 lines per picture and 1,920 pixels per line.  Broadcasters also favoured interlaced scanning because their vast archive of interlaced programming is not readily compatible with a progressive format.  William F. Schreiber, who was director of the Advanced Television Research Program at the Massachusetts Institute of Technology from 1983 until his retirement in 1990, thought that the continued advocacy of interlaced equipment originated from consumer electronics companies that were trying to get back the substantial investments they made in the interlaced technology.
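The data-rate argument between the two camps comes down to simple arithmetic.  A toy comparison, assuming a 1,080-line picture refreshed 60 times a second:

```python
# Progressive scanning paints every line on each pass; interlaced
# scanning paints only every other line (one 'field') per pass.
LINES = 1080        # lines in a full picture
PASSES_PER_SEC = 60

progressive_lines_per_sec = LINES * PASSES_PER_SEC
interlaced_lines_per_sec = (LINES // 2) * PASSES_PER_SEC

# At the same refresh rate, progressive generates twice the data.
assert progressive_lines_per_sec == 2 * interlaced_lines_per_sec
print(progressive_lines_per_sec, interlaced_lines_per_sec)  # → 64800 32400
```

The same factor of two is why broadcasters, squeezed for channel bandwidth, clung to interlacing, while the computer and film industries, working with displays and storage rather than radio spectrum, preferred progressive.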

The digital TV transition started in the late 2000’s.  Many governments around the world set deadlines for analogue shutdown during the 2010’s.  Initially, the adoption rate was low, as the first digital tuner-equipped TV sets were costly, but soon, as the price of digital-capable TV sets dropped, more and more households converted to digital TV sets. 

Smart Television

Read more about Smart Television here.

The advent of digital television (TV) allowed innovations like smart TV sets.  A smart television, sometimes referred to as a connected TV or hybrid TV, is a TV set or set-top box with integrated Internet and Web 2.0 features, and is an example of technological convergence between computers, television sets and set-top boxes.  Besides the traditional functions of TV sets and set-top boxes provided through traditional broadcasting media, these devices can also provide Internet TV, online interactive media, over-the-top content, as well as on-demand streaming media, and home networking access.  These TV’s come pre-loaded with an operating system.

Smart TV is not to be confused with Internet TV, Internet Protocol television or Web TV.  Internet television refers to the receiving of television content over the Internet instead of by traditional systems such as terrestrial, cable and satellite (although the Internet itself is received by these methods).  Internet protocol television (IPTV) is one of the emerging Internet television technology standards for use by TV networks.  Web TV is a term used for programs created by a wide variety of companies and individuals for broadcast on Internet TV.  A first patent was filed in 1994 (and extended the following year) for an intelligent TV system, linked with data processing systems, by means of a digital or analogue network.  Apart from being linked to data networks, one key point is its ability to automatically download necessary software routines, according to a user’s demand, and process their needs.  In 2015, major TV manufacturers announced that they would produce only smart TV’s for their mid-range and high-end models.  Smart TV’s have become more affordable since they were first introduced, with 46 million United States (U.S.) households having at least one as of 2019.

Image © LG via Wikipedia

An LG Smart TV.

3D Television 

Read more about 3D Television here.

3D television (3DTV) conveys depth perception to the viewer by employing techniques such as stereoscopic display, multi-view display, 2D-plus-depth, or any other form of 3D display.  Most modern 3D television (TV) sets use an active shutter 3D system or a polarised 3D system, and some are autostereoscopic without the need for glasses.  Stereoscopic 3D television was demonstrated for the first time on the 10th of August, 1928, by John Logie Baird in his company’s premises at 133 Long Acre, London.  Baird pioneered a variety of 3D television systems using electromechanical and cathode-ray tube (CRT) techniques.  The first 3D TV was produced in 1935.  The advent of digital TV in the 2000’s greatly improved 3D TV sets.  Although 3D TV sets are quite popular for watching 3D home media such as on Blu-ray discs, 3D programming has largely failed to make inroads with the public.  Many 3D TV channels which started in the early 2010’s were shut down by the mid-2010’s.  According to DisplaySearch, 3D TV shipments totalled 41.45 million units in 2012, compared with 24.14 million in 2011 and 2.26 million in 2010.  As of late 2013, the number of 3D TV viewers started to decline.

Broadcast Systems

Terrestrial Television

Read more about Terrestrial Television here and here.

Programming is broadcast by television (TV) stations, sometimes called channels, as stations are licensed by their governments to broadcast only over assigned channels in the TV band.  At first, terrestrial broadcasting was the only way TV could be widely distributed, and because bandwidth was limited, i.e., there were only a small number of channels available, government regulation was the norm.  In the United States (U.S.), the Federal Communications Commission (FCC) allowed stations to broadcast advertisements beginning in July 1941 but required public service programming commitments as a requirement for a license.  By contrast, the United Kingdom (U.K.) chose a different route, imposing a TV license fee on owners of TV reception equipment to fund the British Broadcasting Corporation (BBC) which had public service as part of its Royal Charter.

WRGB claims to be the world’s oldest TV station, tracing its roots to an experimental station founded on the 13th of January, 1928, broadcasting from the General Electric (G.E.) factory in Schenectady, New York, U.S., under the call letters W2XB.  It was popularly known as WGY Television after its sister radio station.  Later in 1928, G.E. started a second facility, this one in New York City, which had the call letters W2XBS and which today is known as WNBC.  The two stations were experimental in nature and had no regular programming, as receivers were operated by engineers within the company.  The image of a Felix the Cat doll rotating on a turntable was broadcast for two hours every day for several years as new technology was being tested by the engineers.  On the 2nd of November 1936, the BBC began transmitting the world’s first public regular high-definition service from the Victorian Alexandra Palace in north London.  It therefore claims to be the birthplace of TV broadcasting as we now know it.

With the widespread adoption of cable across the U.S. in the 1970’s and 1980’s, terrestrial TV broadcasts have been in decline.  In 2013, it was estimated that about 7% of U.S. households used an antenna.  A slight increase in use began around 2010 due to the switchover to digital terrestrial TV broadcasts, which offered pristine image quality over very large areas and offered an alternative to cable TV (CATV) for cord-cutters.  Other countries around the world are also in the process of either shutting down analogue terrestrial TV or switching over to digital terrestrial TV.

Image © Tennen-Gas via Wikipedia

A modern high-gain UHF Yagi television antenna.

This antenna is used for UHF HDTV reception.  The antenna’s main lobe is off the right end of the antenna and it is most sensitive to stations in that direction.  Each of the metal crossbars along the antenna support boom is called an element, which acts as a half-wave dipole resonator for the radio waves.  The antenna has one driven element which is attached to the TV and it is behind the black box.  The black box is a preamplifier which increases the power of the TV signal before it is sent to the TV set.  The 17 elements to the right of the driven element are called directors.  They reinforce the signal.   The 4 elements on the V-shaped boom are called a corner reflector and they serve to reflect the signal back toward the driven element. 

Yagi HDTV antennas use a corner reflector to increase the bandwidth of the antenna.  The rest of the antenna increases the gain at higher channels, while the corner reflector increases the gain at lower channels.
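The half-wave dipole idea behind each element can be sketched numerically: an element resonates when it is cut to roughly half the wavelength of the target frequency.  The 600 MHz figure below is just an illustrative mid-band UHF frequency, and real elements are trimmed a few per cent shorter than this free-space length:

```python
C = 299_792_458  # speed of light in m/s

def half_wave_dipole_length_m(freq_hz):
    """Approximate resonant length of a half-wave dipole element."""
    wavelength = C / freq_hz
    return wavelength / 2

# A UHF TV channel around 600 MHz calls for elements about 25 cm long,
# which is why UHF Yagis are so much more compact than VHF ones.
print(round(half_wave_dipole_length_m(600e6), 2))  # → 0.25
```

The same formula shows why lower channels need longer elements, which is the job the corner reflector takes over at the low end of the band.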

Cable Television

Read more about Cable Television here and here.

Cable television (CATV) is a system of broadcasting television (TV) programming to paying subscribers via radio frequency (RF) signals transmitted through coaxial cables or light pulses through fibre-optic cables.  This contrasts with traditional terrestrial TV, in which the TV signal is transmitted over the air by radio waves and received by a television antenna attached to the TV.  Since the 2000’s, frequency modulation (FM) radio programming, high-speed Internet, telephone service, and similar non-television services have also been provided through these cables.  The abbreviation CATV is used for cable television in the United States (U.S.).  It originally stood for Community Access Television or Community Antenna Television, from cable television’s origins in 1948: in areas where over-the-air reception was limited by distance from transmitters or mountainous terrain, large community antennas were constructed, and cable was run from them to individual homes.

Image © Peter Trieb via Wikipedia and is in the public domain

Coaxial cable.

This cable is used to carry cable television signals to cathode-ray tube and flat-panel TV sets.

Satellite Television

Read more about Satellite Television here.

Satellite television is a system of supplying television (TV) programming using broadcast signals relayed from communication satellites.  The signals are received via an outdoor parabolic reflector antenna usually referred to as a satellite dish and a low-noise block downconverter.  A satellite receiver then decodes the desired TV program for viewing on a television set.  Receivers can be external set-top boxes or a built-in TV tuner.  Satellite TV provides a wide range of channels and services, especially to geographic areas without terrestrial TV or cable TV (CATV).

The most common method of reception is direct-broadcast satellite TV, also known as direct-to-home.  In direct-broadcast satellite television (DBSTV) systems, signals are relayed from a direct broadcast satellite in the Ku band and are completely digital.  Satellite TV systems formerly used systems known as TV receive-only.  These systems received analogue signals transmitted in the C-band spectrum from fixed-satellite service (FSS) type satellites and required the use of large dishes.  Consequently, these systems were nicknamed big dish systems and were more expensive and less popular.

The direct-broadcast satellite (DBS) TV signals were earlier analogue signals and later digital signals, both of which require a compatible receiver.  Digital signals may include high-definition television (HDTV).  Some transmissions and channels are free-to-air or free-to-view, while many other channels are pay television, requiring a subscription.  In 1945, British science fiction writer Arthur C. Clarke proposed a worldwide communications system which would function by means of three satellites equally spaced apart in Earth’s orbit.  This was published in the October 1945 issue of Wireless World magazine and won him the Franklin Institute’s Stuart Ballantine Medal in 1963.

The first satellite TV signals from Europe to North America were relayed via the Telstar satellite over the Atlantic Ocean on the 23rd of July, 1962.  The signals were received and broadcast in North American and European countries and watched by over 100 million people.  Launched in 1962, the Relay 1 satellite was the first satellite to transmit TV signals from the U.S. to Japan.  The first geosynchronous communication satellite, Syncom 2, was launched on the 26th of July, 1963.

The world’s first commercial communications satellite, called Intelsat I and nicknamed Early Bird, was launched into geosynchronous orbit on the 6th of April, 1965.  The first national network of TV satellites, called Orbita, was created by the Soviet Union in October 1967 and was based on the principle of using the highly elliptical Molniya satellite for rebroadcasting and delivering television signals to ground downlink stations.  The first commercial North American satellite to carry TV transmissions was Canada’s geostationary Anik 1, which was launched on the 9th of November, 1972.  ATS-6, the world’s first experimental educational and Direct Broadcast Satellite, was launched on the 30th of May, 1974.  It transmitted at 860 MHz using wideband frequency modulation (FM) and had two sound channels.  The transmissions were focused on the Indian subcontinent, but experimenters were able to receive the signal in Western Europe using home-constructed equipment that drew on ultra high frequency (UHF) television design techniques already in use.

The first in a series of Soviet geostationary satellites to carry Direct-To-Home television, Ekran 1, was launched on the 26th of October, 1976.  It used a 714 MHz UHF downlink frequency so that the transmissions could be received with existing UHF television technology rather than microwave technology.

Image © Brian Katt via Wikipedia

DBS satellite dishes.

These dishes are installed on an apartment complex in San Jose, California, U.S.A.

Internet Television

Read more about Internet Television here.

Internet television (or online television) is the digital distribution of television (TV) content via the Internet as opposed to traditional systems like terrestrial, cable, and satellite, although the Internet itself is received by terrestrial, cable, or satellite methods.  Internet television is a general term that covers the delivery of television series, and other video content, over the Internet by video streaming technology, typically by major traditional television broadcasters.  Internet television should not be confused with Smart TV, Internet Protocol Television (IPTV) or Web TV.  Smart television refers to a television set which has a built-in operating system.  IPTV is one of the emerging Internet television technology standards for use by television networks.  Web television is a term used for programs created by a wide variety of companies and individuals for broadcast on Internet television.

Television Sets

Read more about Television Sets here.

A television set, also called a television receiver, television (TV), TV set, or telly, is a device that combines a tuner, display, amplifier, and speakers for the purpose of viewing television and hearing its audio components.  Introduced in the late 1920’s in mechanical form, television sets became a popular consumer product after World War II in electronic form, using cathode-ray tubes (CRT).  The addition of colour to broadcast television after 1953 further increased the popularity of TV sets, and an outdoor antenna became a common feature of suburban homes.  The ubiquitous TV set became the display device for recorded media in the 1970’s, such as Betamax and Video Home System (VHS), which enabled viewers to record TV shows and watch prerecorded movies.  In the subsequent decades, TV sets were used to watch digital versatile discs (DVD) and Blu-ray Discs of movies and other content.  Major TV manufacturers announced the discontinuation of CRT, Digital Light Processing (DLP), plasma and fluorescent-backlit liquid-crystal displays (LCD) by the mid-2010’s.  Since the 2010’s, tellies have mostly used light-emitting diodes (LED).  These are expected to be gradually replaced by organic light-emitting diodes (OLED) in the near future.

Image © Fletcher6 via Wikipedia

An RCA Model 630-TS Television.

The RCA 630-TS was the first mass-produced television set.  It was sold from 1946 to 1947.

Display Technologies

Read more about Display Technologies here.

Disk

Read more about Disk here.

The earliest systems employed a spinning disk to create and reproduce images.  These usually had a low resolution and screen size and never became popular with the public.

CRT

Read more about CRT here.

The cathode-ray tube (CRT) is a vacuum tube used in a television (TV) containing one or more electron guns (a source of electrons, or electron emitter) and a fluorescent screen used to view images.  It has a means to accelerate and deflect electron beams onto the screen to create the images.  The images may represent electrical waveforms (oscilloscope), pictures (TV, computer monitor), radar targets or others.  The CRT uses an evacuated glass envelope which is large, deep (i.e. long from front screen face to rear end), fairly heavy, and relatively fragile.  As a matter of safety, the face is typically made of thick lead glass so as to be highly shatter-resistant and to block most X-ray emissions, particularly if the CRT is used in a consumer product.

In television sets and computer monitors, the entire front area of the tube is scanned repetitively and systematically in a fixed pattern called a raster.  An image is produced by controlling the intensity of each of the three electron beams, one for each additive primary colour (red, green, and blue) with a video signal as a reference.  In all modern CRT monitors and televisions, the beams are bent by magnetic deflection, a varying magnetic field generated by coils and driven by electronic circuits around the neck of the tube, although electrostatic deflection is commonly used in oscilloscopes, a type of diagnostic instrument.

Image © Blue tooth7 via Wikipedia

A 14″ cathode-ray tube.

This LG.Philips cathode-ray tube shows its deflection coils and electron guns.

DLP

Read more about DLP here.

Digital Light Processing (DLP) is a type of video projector technology that uses a digital micromirror device.  Some DLP sets have a television (TV) tuner, which makes them a type of TV display.  It was originally developed in 1987 by Dr. Larry Hornbeck of Texas Instruments.  While the DLP imaging device was invented by Texas Instruments, the first DLP-based projector was introduced by Digital Projection Ltd in 1997.  Digital Projection and Texas Instruments were both awarded Emmy Awards in 1998 for the invention of the DLP projector technology.  DLP is used in a variety of display applications, from traditional static displays to interactive displays and non-traditional embedded applications including medical, security, and industrial uses.  DLP technology is used in DLP front projectors (standalone projection units, primarily for classrooms and businesses), but also in private homes.  In these cases, the image is projected onto a projection screen.  DLP is also used in DLP rear-projection TV sets and digital signs.  It is also used in about 85% of digital cinema projection.

Image © Dave Pape via Wikipedia and is in the public domain

A Christie Mirage 5000 DLP projector.

This projector made by Christie is circa 2001.  It was one of four being used in the CAVE virtual reality system at EVL in Chicago, U.S.A. and was capable of 120 Hz field-sequential stereo at 1280×1024 resolution, with 5000 lumens brightness.

Plasma

Read more about Plasma here.

A plasma display panel (PDP) is a type of flat panel display common to large television (TV) displays 30 inches (76 cm) or larger.  They are called plasma displays because the technology uses small cells containing electrically charged ionised gases; these cells are, in essence, chambers like those found in fluorescent lamps.

LCD

Read more about LCD here.

Liquid-crystal-display (LCD) televisions are television (TV) sets that use LCD technology to produce images.  LCD TV’s are much thinner and lighter than cathode-ray tubes (CRT) of similar display size and are available in much larger sizes (e.g., 90-inch diagonal).  When manufacturing costs fell, this combination of features made LCD’s practical for TV receivers.  LCD’s come in two types: those backlit by cold cathode fluorescent lamps, simply called LCD’s, and those backlit by light-emitting diodes (LED), called LED’s.

In 2007, LCD TV sets surpassed sales of CRT-based TV sets worldwide for the first time, and their sales figures relative to other technologies accelerated.  LCD TV sets quickly displaced their only major competitors in the large-screen market, the plasma display panel and rear-projection TV.  By the mid-2010’s, LCD’s, especially LED’s, had become, by far, the most widely produced and sold TV display type.  LCD’s also have disadvantages.  Other technologies address these weaknesses, including organic light-emitting diode (OLED), field emission display (FED) and surface-conduction electron-emitter display (SED) TV’s, but as of 2014 none of these had entered widespread production.

OLED

Read more about OLED here.

An organic light-emitting diode (OLED) is a light-emitting diode in which the emissive electroluminescent layer is a film of organic compound which emits light in response to an electric current.  This layer of organic semiconductor is situated between two electrodes.  Generally, at least one of these electrodes is transparent.  OLED’s are used to create digital displays in devices such as television (TV) screens, computer monitors, and portable systems such as mobile phones, handheld game consoles and personal digital assistants (PDA).

There are two main groups of OLED, those based on small molecules and those employing polymers.  Adding mobile ions to an OLED creates a light-emitting electrochemical cell (LEC), which has a slightly different mode of operation.  OLED displays can use either passive-matrix or active-matrix addressing schemes.  Active-matrix OLED’s require a thin-film transistor backplane to switch each individual pixel on or off but allow for higher resolution and larger display sizes.

An OLED display works without a backlight.  Thus, it can display deep black levels and can be thinner and lighter than a liquid crystal display (LCD).  In low ambient light conditions such as a dark room, an OLED screen can achieve a higher contrast ratio than an LCD, whether it uses cold cathode fluorescent lamps or a light-emitting diode (LED) backlight.  OLED’s are expected to replace other forms of display in the near future.

Image © LG via Wikipedia

An LG 3D OLED TV.

Display Resolution

LDTV

Read more about LDTV here.

Low-definition television (LDTV) refers to television (TV) systems that have a lower screen resolution than standard-definition TV systems, such as 240p (320×240).  It is used in handheld tellies.  The most common source of LDTV programming is the Internet, where mass distribution of higher-resolution video files could overwhelm computer servers and take too long to download.  Many mobile phones and portable devices, such as Apple’s iPod Nano or Sony’s PlayStation Portable, use LDTV video, as higher-resolution files would be excessive to the needs of their small screens (320×240 and 480×272 pixels respectively).  The current generation of iPod Nanos has LDTV screens, as do the first three generations of iPod Touch and iPhone (480×320).  For the first years of its existence, YouTube offered only one low-definition (LD) resolution of 320×240 at 30 frames per second or less.  A standard, consumer-grade videotape can be considered standard-definition television (SDTV) due to its resolution (approximately 360×480i/576i).

Image © Libron via Wikipedia and is in the public domain

A comparison of 8K UHDTV, 4K UHDTV, HDTV and SDTV resolution.

SDTV

Read more about SDTV here.

Standard-definition television (SDTV) refers to two different resolutions: 576i, with 576 interlaced lines of resolution, derived from the European-developed Phase Alternating Line (PAL) and Séquentiel couleur à mémoire (SECAM, French for colour sequential with memory) systems, and 480i, based on the American National Television System Committee (NTSC) system.  SDTV is a television (TV) system that uses a resolution that is not considered to be either high-definition television (HDTV) (720p, 1080i, 1080p, 1440p, 4K ultra-high-definition television (UHDTV) and 8K UHDTV) or enhanced-definition television (EDTV, 480p).  In North America, digital SDTV is broadcast in the same 4:3 aspect ratio as NTSC signals, with widescreen content being centre cut.  However, in other parts of the world that used the PAL or SECAM colour systems, SDTV is now usually shown with a 16:9 aspect ratio, with the transition occurring between the mid-1990’s and mid-2000’s.  Older programmes with a 4:3 aspect ratio are shown in the United States (U.S.) as 4:3, with non-Advanced Television Systems Committee (ATSC) countries preferring to reduce the horizontal resolution by anamorphically scaling a pillarboxed image.
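The aspect-ratio arithmetic behind pillarboxing can be sketched quickly: a 4:3 picture inside a 16:9 frame occupies (4/3) ÷ (16/9) = 3/4 of the frame’s width.  This is a minimal illustration only (the function name is my own, not broadcast terminology):

```python
from fractions import Fraction

def pillarbox_width(frame_width, frame_aspect=Fraction(16, 9),
                    content_aspect=Fraction(4, 3)):
    """Width in pixels that the content occupies when pillarboxed
    inside a wider frame of the given width."""
    return int(frame_width * content_aspect / frame_aspect)

# 4:3 content fills 3/4 of a 16:9 frame's width:
print(pillarbox_width(1920))  # 1440
print(pillarbox_width(720))   # 540
```

So on a 1920-pixel-wide screen, centre-cut or pillarboxed 4:3 material spans only 1440 pixels, with black bars filling the remaining 480.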

HDTV

Read more about HDTV here.

High-definition television (HDTV) provides a resolution that is substantially higher than that of standard-definition television (SDTV).

HDTV may be transmitted in various formats:

1080p: 1920×1080p: 2,073,600 pixels (~2.07 megapixels) per frame.

1080i: 1920×1080i: 1,036,800 pixels (~1.04 MP) per field or 2,073,600 pixels (~2.07 MP) per frame.

A non-standard CEA resolution exists in some countries such as 1440×1080i: 777,600 pixels (~0.78 MP) per field or 1,555,200 pixels (~1.56 MP) per frame.

720p: 1280×720p: 921,600 pixels (~0.92 MP) per frame.
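The pixel counts in the list above follow directly from width × height per frame, with each interlaced field carrying half of the frame’s lines.  A quick sketch (helper name is my own, for illustration only):

```python
def pixels(width, height, interlaced=False):
    """Pixels per frame, plus pixels per field for interlaced formats
    (each field carries half the frame's lines)."""
    frame = width * height
    return (frame, frame // 2) if interlaced else (frame, None)

print(pixels(1920, 1080))        # (2073600, None)    -> 1080p
print(pixels(1920, 1080, True))  # (2073600, 1036800) -> 1080i
print(pixels(1280, 720))         # (921600, None)     -> 720p
print(pixels(1440, 1080, True))  # (1555200, 777600)  -> non-standard 1080i
```

These match the ~2.07 MP, ~1.04 MP, ~0.92 MP and ~1.56 MP figures quoted above.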

UHDTV

Read more about UHDTV here.

Ultra-high-definition television (UHDTV), also known as Super Hi-Vision, Ultra HD or UHD, includes 4K UHD (2160p) and 8K UHD (4320p), which are two digital video formats proposed by NHK Science & Technology Research Laboratories and defined and approved by the International Telecommunication Union (ITU).  The Consumer Electronics Association (CEA) announced on the 17th of October, 2012, that UHD, or Ultra HD, would be used for displays that have an aspect ratio of at least 16:9 and at least one digital input capable of carrying and presenting natural video at a minimum resolution of 3840×2160 pixels.

Content

Television Programming

Read more about Television Programming here, here and here.

Getting television (TV) programming shown to the public can happen in many ways.  After production, the next step is to market and deliver the product to whichever markets are open to using it.  This typically happens on two levels:

Original run or First run (a producer creates a programme of one or multiple episodes and shows it on a station or network which has either paid for the production itself or to which a license has been granted by the TV producers to do the same).

Broadcast syndication  (this is the terminology rather broadly used to describe secondary programming usages i.e. beyond its original run.  It includes secondary runs in the country of the first issue, but also international usage which may not be managed by the originating producer.  In many cases, other companies, TV stations, or individuals are engaged to do the syndication work, in other words, to sell the product into the markets they are allowed to sell into by contract from the copyright holders, in most cases the producers).

First-run programming is increasing on subscription services outside of the United States (U.S.), but few domestically produced programs are syndicated on domestic free-to-air (FTA) elsewhere.  This practice is increasing, however, generally on digital-only FTA channels or with subscriber-only, first-run material appearing on FTA.  Unlike the U.S., repeat FTA screenings of an FTA network program usually only occur on that network.  Also, affiliates rarely buy or produce non-network programming that is not focused on local programming.

Television Genres

Television (TV)  genres include a broad range of programming types that entertain, inform, and educate viewers.  The most expensive entertainment genres to produce are usually dramas and dramatic miniseries.  However, other genres, such as historical Western genres, may also have high production costs.

Pop culture entertainment genres include action-oriented shows such as police, crime, detective dramas, horror, or thriller shows.  As well, there are also other variants of the drama genre, such as medical dramas and daytime soap operas.  Sci-fi series can fall into either the drama or action category, depending on whether they emphasise philosophical questions or high adventure.  Comedy is a popular genre which includes situation comedy (sitcom) and animated series for the adult demographic such as Comedy Central’s South Park.

The least expensive forms of entertainment programming genres are game shows, talk shows, variety shows, and reality TV.  Game shows feature contestants answering questions and solving puzzles to win prizes.  Talk shows contain interviews with film, TV, music and sports celebrities and public figures.  Variety shows feature a range of musical performers and other entertainers, such as comedians and magicians, introduced by a host or Master of Ceremonies.  There is some crossover between some talk shows and variety shows because leading talk shows often feature performances by bands, singers, comedians, and other performers in between the interview segments.  Reality TV series feature regular people (i.e., not actors) facing unusual challenges or experiences, ranging from arrest by police officers to significant weight loss.  A derived version of reality shows depicts celebrities doing mundane activities such as going about their everyday life or doing regular jobs.

Fictional TV programmes that some telly scholars and broadcasting advocacy groups argue are quality TV programmes include series such as The Sopranos.  Kristin Thompson argues that some of these television series exhibit traits also found in art films, such as psychological realism, narrative complexity, and ambiguous plot lines.  Nonfiction TV programmes that some telly scholars and broadcasting advocacy groups argue are quality television programmes, include a range of serious, noncommercial, programming aimed at a niche audience, such as documentaries and public affairs shows. 

Television Funding

Around the world, broadcast television (TV) is financed by government, advertising, licensing (a form of tax), subscription, or any combination of these.  To protect revenues, subscription TV channels are usually encrypted to ensure that only subscribers receive the decryption codes to see the signal.  Unencrypted channels are known as free-to-air (FTA).  In 2009, the global TV market represented 1,217.2 million TV households with at least one TV and total revenues of 268.9 billion EUR (declining 1.2% compared to 2008).  North America had the biggest TV revenue market share with 39% followed by Europe (31%), Asia-Pacific (21%), Latin America (8%), and Africa and the Middle East (2%).  Globally, the different TV revenue sources are divided into 45–50% TV advertising revenues, 40–45% subscription fees and 10% public funding.

Television Advertising

Read more about Television advertising here

Television’s broad reach makes it a powerful and attractive medium for advertisers.  Many television (TV) networks and stations sell blocks of broadcast time to advertisers (sponsors) to fund their programming.  A television advertisement (also called a TV commercial, commercial, ad or advert) is a span of TV programming produced and paid for by an organisation, which conveys a message, typically to market a product or service.  Advertising revenue provides a significant portion of the funding for most privately owned TV networks.  The vast majority of TV ads today consist of brief advertising spots, ranging in length from a few seconds to several minutes (as well as programme-length infomercials).  Adverts of this sort have been used to promote a wide variety of goods, services and ideas since the beginning of TV.

The effects of TV advertising upon the viewing public (and the effects of mass media in general) have been the subject of discourse by philosophers including Marshall McLuhan.  The viewership of TV programming, as measured by companies such as Nielsen Media Research, is often used as a metric for TV  advertisement placement, and consequently, for the rates charged to advertisers to air within a given network, television programme, or time of day (called a daypart).  In many countries, including the United States (U.S.), TV campaign advertisements are considered indispensable for a political campaign.  In other countries, such as France, political advertising on the telly is heavily restricted, while some countries, such as Norway, completely ban political adverts.

The first official, paid television ad was broadcast in the U.S. on the 1st of July, 1941, over New York station WNBT (now WNBC) before a baseball game between the Brooklyn Dodgers and Philadelphia Phillies.  The announcement for Bulova watches, for which the company paid anywhere from $4.00 to $9.00 (reports vary), displayed a WNBT test pattern modified to look like a clock with the hands showing the time.  The Bulova logo, with the phrase Bulova Watch Time, was shown in the lower right-hand quadrant of the test pattern while the second hand swept around the dial for one minute.  The first TV ad broadcast in the United Kingdom (U.K.) was on ITV on the 22nd of September, 1955, advertising Gibbs SR toothpaste.  The first TV ad broadcast in Asia was on Nippon Television in Tokyo on the 28th of August, 1953, advertising Seikosha (now Seiko), which also displayed a clock with the current time.

Image via Swtpc6800 on Wikipedia and is in the public domain

Radio News cover, September, 1928.

Television was still in its experimental phase in 1928, but the medium’s potential to sell goods was already predicted.  It was seen as the ideal television of the future but these early experimental televisions could not maintain synchronisation with the camera.  The viewer had to constantly make adjustments as seen by the sync control in the man’s hand.  

United Kingdom

The television (TV) regulator oversees TV advertising in the United Kingdom (U.K.).  Its restrictions have applied since the early days of commercially funded TV.  Despite this, an early TV mogul, Roy Thomson, likened the broadcasting licence to a licence to print money.  Restrictions mean that the big three national commercial TV channels, ITV, Channel 4, and Channel 5, can show an average of only seven minutes of advertising per hour (eight minutes in the peak period).  Other broadcasters must average no more than nine minutes (twelve in the peak).  This means that many imported TV shows from the United States (U.S.) have unnatural pauses where a British company does not use the narrative breaks intended for more frequent U.S. advertising.  Advertisements must not be inserted in the course of certain specific proscribed types of programmes which last less than half an hour in scheduled duration.  This list includes any news or current affairs programmes, documentaries, and programmes for children.  Additionally, adverts may not be carried in a programme designed and broadcast for reception in schools, in any religious broadcasting service or other devotional programme, or during a formal Royal ceremony or occasion.  There must also be clear demarcations in time between the programmes and the adverts.  The British Broadcasting Corporation (BBC), being strictly non-commercial, is not allowed to show advertisements on TV in the U.K., although it has many advertising-funded channels abroad.  The majority of its budget comes from TV licence fees and broadcast syndication, the sale of content to other broadcasters.

United States

Since its inception in the United States (U.S.) in 1941, television (TV) commercials have become one of the most effective, persuasive, and popular methods of selling products of many sorts, especially consumer goods.  During the 1940’s and into the 1950’s, programmes were hosted by single advertisers.  This, in turn, gave great creative control to the advertisers over the content of the show.  Perhaps due to the quiz show scandals in the 1950’s, networks shifted to the magazine concept, introducing advertising breaks with other advertisers.

U.S. advertising rates are determined primarily by Nielsen ratings.  The time of the day and popularity of the channel determine how much a TV commercial can cost.  For example, it can cost approximately $750,000 for a 30-second block of commercial time during the highly popular singing competition American Idol, while the same amount of time for the Super Bowl can cost several million dollars. Conversely, lesser-viewed time slots, such as early mornings and weekday afternoons, are often sold in bulk to producers of infomercials at far lower rates.  In recent years, the paid programme or infomercial has become common, usually in lengths of 30 minutes or one hour.  Some drug companies and other businesses have even created news items for broadcast, known in the industry as video news releases, paying programme directors to use them.

Some TV programmes also deliberately place products into their shows as advertisements, a practice started in feature films and is known as product placement.  For example, a character could be drinking a certain kind of pop, going to a particular chain restaurant, or driving a certain make of car.  This is sometimes very subtle, with shows having vehicles provided by manufacturers for low cost in exchange for product placement.  Sometimes, a specific brand or trade mark, or music from a certain artist or group, is used.   This excludes guest appearances by artists who perform on the show.

Ireland

Broadcast advertising is regulated by the Broadcasting Authority of Ireland.

Subscription 

Some television (TV) channels are partly funded from subscriptions, therefore, the signals are encrypted during the broadcast to ensure that only the paying subscribers have access to the decryption codes to watch pay television or speciality channels.  Most subscription services are also funded by advertising.

Taxation Or License

Television (TV) services in some countries may be funded by a TV licence or a form of taxation, which means that advertising plays a lesser role or no role at all.  For example, some channels may carry no advertising at all and some very little, including:

Australia (ABC Television).

Belgium (VRT for Flanders and RTBF for Wallonia).

Denmark (DR).

Ireland (RTE).

Japan (NHK).

Norway (NRK).

Sweden (SVT).

Switzerland (SRG SSR).

Republic of China (Taiwan) (PTS).

United Kingdom (BBC).

United States (PBS).

Broadcast Programming

Read more about Broadcast Programming here and here.

Broadcast programming, or television (TV) listings in the United Kingdom (U.K.), is the practice of organising TV programmes in a schedule, with broadcast automation used to regularly change the scheduling of TV programmes to build an audience for a new show, retain that audience, or compete with other broadcasters’ programmes.

See Also

Blog Posts

Notes And Links

Article source: Wikipedia and is subject to change.

Max Rahubovskiy on Pexels –  The image shown at the top of this page is the copyright of Max Rahubovskiy.  You can find more great work from the photographer Max by clicking the link above and you can get lots more free stock photos at Pexels.

The image above of Flat-screen televisions in 2008 is the copyright of Wikipedia user Wags05.  It is in the Public Domain. 

The image above of the Nipkow Disk is the copyright of Wikipedia user Hzeller.   It comes with a Creative Commons licence (CC BY-SA 3.0).  

The image above of John Ferdinand Braun is copyright unknown and is in the Public Domain.

The image above of Vladimir Zworykin is copyright unknown and is in the Public Domain.

The image above of Manfred von Ardenne in 1933 is copyright unknown.   It comes with a Creative Commons licence (CC BY-SA 3.0).

The image above of A Radio Corporation Of America 1939 Advertisement is copyright unknown and is in the Public Domain.

The image above of A 40″ Samsung Full HD LED TV is the copyright of Wikipedia user Kskhh.   It comes with a Creative Commons licence (CC BY-SA 4.0).  

The image above of SMPTE colour bars is the copyright of Wikipedia user Denelson83.  It is in the Public Domain. You can see more of his/her great work here.

The image above of an LG Smart TV is the copyright of Wikipedia user LG.  It comes with a Creative Commons licence (CC BY-SA 2.0).  You can see more of their great work here.

The image above of A modern high-gain UHF Yagi television antenna is the copyright of Wikipedia user Tennen-Gas.   It comes with a Creative Commons licence (CC BY-SA 3.0).  

The image above of Coaxial cable is the copyright of Wikipedia user Peter Trieb.  It is in the Public Domain. 

The image above of DBS satellite dishes is the copyright of Wikipedia user Brian Katt.   It comes with a Creative Commons licence (CC BY-SA 3.0).  You can see more of his great work here.

The image above of an RCA Model 630-TS television is the copyright of Wikipedia user Fletcher6.   It comes with a Creative Commons licence (CC BY-SA 3.0).  You can see more of his/her work here.

The image above of a 14″ cathode-ray tube is the copyright of Wikipedia user Blue tooth7.   It comes with a Creative Commons licence (CC BY-SA 3.0)

The image above of a Christie Mirage 5000 DLP projector is the copyright of Wikipedia user Dave Pape.  It is in the Public Domain. You can see more of his great work here.

The image above of an LG 3D OLED TV is the copyright of Wikipedia user LG.  It comes with a Creative Commons licence (CC BY-SA 2.0).  You can see more of their great work here.

The image above of a comparison of 8K UHDTV, 4K UHDTV, HDTV and SDTV resolution is the copyright of Wikipedia user Libron.  It is in the Public Domain.   

The image above of the Radio News cover, September, 1928 is provided by Wikipedia user Swtpc6800 and is in the Public Domain.

Birmingham: The Old Crown In Digbeth Photos (Part 2)

On Monday the 11th of September, 2023,  I visited the Old Crown in High Street, Digbeth, Birmingham, for the first time in my life, as part of Birmingham Heritage Week.  

Being a Brummie, born and bred, I have passed this pub many times, especially as I got older, and I always wondered what it would be like inside.  Although as an adult I could have popped in at any time, I never got around to it until now.  I was pleased to see that, as part of Heritage Week, this medieval pub was presenting an exhibition on the 655-year history of Birmingham’s oldest pub.  It included never-before-seen photos and illustrations of the Grade II* listed venue, and a booklet by Carl Chinn was being given away.

I couldn’t really look around and appreciate how historic it is as much as I would have liked, or take better shots of the old features inside, because it was packed (and noisy), but I managed to take some decent enough photos to share.  Sadly, and bloody annoyingly, 19 photos didn’t turn out at all.  It had been a long day for me, coming from Edgbaston after a lot of walking around Cannon Hill Park (another Heritage Week event), and it was a very hot day, so my phone was on charge, overheating and playing up by then, which would explain that mystery.  It is just my usual bad luck, but that’s a subject for another day!

I would have liked to have taken better photos outside too, but with seemingly never-ending roadworks and fences everywhere, and the other side of the road completely blocked off, the options for decent shots are very restricted indeed.

As someone who battles mental health problems daily, it wasn’t easy being there on my own and my anxiety was very high, but it is a nice pub and I am glad I went.  I hope to take some better photos one day; however, at £5.50 for a pint of lager shandy, I won’t be going there that often!

The Old Crown In Digbeth Photos (Part 2)

Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker

The interior of The Old Crown in Digbeth. Taken on 09/09/23.

Image © Frank Parker

The Old Crown well in The Old Crown in Digbeth.  Taken on 09/09/23.

Image © Frank Parker

The History of The Old Crown sign in The Old Crown in Digbeth.  Taken on 09/09/23.

Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker

History of The Old Crown in Digbeth.  Taken on 09/09/23.

Blog Posts

Notes And Links

Birmingham: The Old Crown In Digbeth Photos (Part 1)

Image © Frank Parker

On Monday the 11th of September, 2023,  I visited the Old Crown in High Street, Digbeth, Birmingham, for the first time in my life, as part of Birmingham Heritage Week.  

Being a Brummie, born and bred, I have passed this pub many times, especially as I got older, and I always wondered what it would be like inside.  Although as an adult I could have popped in at any time, I never got around to it until now.  I was pleased to see that, as part of Heritage Week, this medieval pub was presenting an exhibition on the 655-year history of Birmingham’s oldest pub.  It included never-before-seen photos and illustrations of the Grade II* listed venue, and a booklet by Carl Chinn was being given away.

I couldn’t really look around and appreciate how historic it is as much as I would have liked, or take better shots of the old features inside, because it was packed (and noisy), but I managed to take some decent enough photos to share.  Sadly, and bloody annoyingly, 19 photos didn’t turn out at all.  It had been a long day for me, coming from Edgbaston after a lot of walking around Cannon Hill Park (another Heritage Week event), and it was a very hot day, so my phone was on charge, overheating and playing up by then, which would explain that mystery.  It is just my usual bad luck, but that’s a subject for another day!

I would have liked to have taken better photos outside too, but with seemingly never-ending roadworks and fences everywhere, and the other side of the road completely blocked off, the options for decent shots are very restricted indeed.

As someone who battles mental health problems daily, it wasn’t easy being there on my own and my anxiety was very high, but it is a nice pub and I am glad I went.  I hope to take some better photos one day; however, at £5.50 for a pint of lager shandy, I won’t be going there that often!

The Old Crown In Digbeth Photos (Part 1)

Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Looking towards the side of The Old Crown in Digbeth. Taken on 09/09/23.
Image © Frank Parker

Looking towards The Old Crown in Digbeth. Taken on 09/09/23.

Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker

The Old Crown in Digbeth. Taken on 09/09/23.

Image © Frank Parker
Image © Frank Parker

Looking towards the side of The Old Crown in Digbeth. Taken on 09/09/23. 

Image © Frank Parker

The Old Crown sign at The Old Crown in Digbeth. Taken on 09/09/23.

Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker
Image © Frank Parker

The interior of The Old Crown in Digbeth. Taken on 09/09/23. 

Blog Posts

Notes And Links

 

Birmingham: The Old Crown In Digbeth

Image © Frank Parker

On Monday the 11th of September, 2023,  I visited the Old Crown in High Street, Digbeth, Birmingham, for the first time in my life, as part of Birmingham Heritage Week.  The pub was celebrating its 655th birthday over the weekend.

Being a Brummie, born and bred, I have passed this pub many times, especially as I got older, and I always wondered what it would be like inside.  Although as an adult I could have popped in at any time, I never got around to it until now.  I was pleased to see that, as part of Heritage Week, this medieval pub was presenting an exhibition on the 655-year history of Birmingham’s oldest pub.  It included never-before-seen photos and illustrations of the Grade II* listed venue, and a booklet by Carl Chinn was being given away.

I couldn’t really look around and appreciate how historic it is as much as I would have liked, or take better shots of the old features inside, because it was packed (and noisy), but I managed to take some decent enough photos to share.  Sadly, and bloody annoyingly, 19 photos didn’t turn out at all.  It had been a long day for me, coming from Edgbaston after a lot of walking around Cannon Hill Park (another Heritage Week event), and it was a very hot day, so my phone was on charge, overheating and playing up by then, which would explain that mystery.  It is just my usual bad luck, but that’s a subject for another day!

I would have liked to have taken better photos outside too, but with seemingly never-ending roadworks and fences everywhere, and the other side of the road completely blocked off, the options for decent shots are very restricted indeed.

As someone who battles mental health problems daily, it wasn’t easy being there on my own and my anxiety was very high, but it is a nice pub and I am glad I went.  I hope to take some better photos one day; however, at £5.50 for a pint of lager shandy, I won’t be going there that often!

 

Photos Of The Old Crown, Digbeth

Click here to see photographic memories of my time there.

About The Old Crown, Digbeth

The Old Crown pub is in Deritend and is a Grade II* listed building retaining its black and white timber frame.  Almost all of the present building dates from the early 16th century. 

The Old Crown is Birmingham’s oldest secular building and has existed since 1368. 

It is Birmingham’s oldest inn; Queen Elizabeth I stayed here in 1575 on her way home from Kenilworth Castle.

Rooms are individually decorated with a mix of en-suite and shared bathrooms.  Facilities include TV, tea and coffee, towels and free wifi.   

The pub has a restaurant, there are various local eateries a short walk away, and the award-winning street food event Digbeth Dining Club takes place just two minutes away.  The Old Crown is situated a 10-minute walk from the city centre and has many local attractions within easy reach.

Having stood the test of time through the English Civil War, the pub and events garden now stand proud in the heart of Digbeth, Birmingham’s thriving creative quarter.

History Of The Old Crown

It is believed the building was constructed between 1450 and 1500, with some evidence dating to 1492 (the same year the Saracen’s Head in nearby Kings Norton was completed).  John Leland, who visited the town in 1538 during his tours of England and Wales, noted the building upon entering Birmingham as a “mansion house of tymber”.  It is thought to have been originally built as the Guildhall and School of St. John, Deritend.  This Guild owned a number of other buildings throughout Warwickshire, including the Guildhall in Henley in Arden.  The building was purchased in 1589 by John Dyckson, alias Bayleys, who, in the 1580’s, had been buying a number of properties and lands in Deritend and Bordesley.

Described as a tenement and garden, running alongside Heath Mill Lane, the building remained in the Dixon alias Baylis (later Dixon) family for the next hundred years.

In the original deed, John Dyckson is described as a Caryer, which in the West Midlands at this time, when roads were nothing more than hollow-ways and bridle paths, implied that he owned several trains of pack-horses.  These would have needed stabling, and Dixon would have needed warehouse space to store goods awaiting dispatch and arrived goods awaiting collection.  Such facilities would be useful to other travellers, and it may well be that the use of the house as an inn dates from this time.  Indeed, since England was in the grip of a patriotic pother over the failed Armada the previous year, it would have been opportune to adopt the name The Crown.  However, the earliest documentary evidence of the building’s use as an inn is from 1626; a marriage settlement dated the 21st of December, 1666 notes it as being “by the sign of the Crowne”.

Heated skirmishes were fought around the building when Prince Rupert’s forces raided Birmingham during the English Civil War.

The building was converted into two houses in 1684 and then into three houses in 1693, and it remained three houses until the 19th century.  In 1851, Joshua Toulmin Smith saved the Old Crown from demolition when the Corporation proposed demolishing the building in order to improve the street.  The Corporation proposed demolition again in 1856 and in 1862, and Smith saved the building each time.

In 1991 a local pub company owned by the Brennan family bought the Old Crown.  In the summer of 1994, Pat Brennan and his youngest son, Peter, were doing repairs and clearing out the old sheds to the rear of the property when they found the old well, which had been closed off for more than 100 years.  Now restored, it is situated at the rear entrance of the pub.   At the end of May 1998, under the guidance of Pat and Ellen Brennan and their sons Patrick, Gary and Peter, after the family’s £2 million investment into Birmingham’s most famous hostelry, The Old Crown was restored to its former glory and reopened.

Image © Frank Parker

The History of The Old Crown sign inside the pub.

Image © Frank Parker

Looking towards The Old Crown, Digbeth, Birmingham.  Taken on 09/09/23.

Image © Oosoom via Wikipedia

The Old Crown in 2006.

Construction Of The Old Crown

The building is 71 feet, 4 inches (21.74 metres) wide and 20 feet, 2 inches (6.15 metres) deep on the ground floor.  On the first floor, which overhangs the front, it is 21 feet, 9 inches (6.63 metres) deep.  When built, the original building had a central hall with a length of 40 feet (12 metres) and a width of 20 feet (6 metres).  Below this were a number of arched cellars, and on the upper floor were just four rooms.  The building had a courtyard to its rear which contained a well, 26 feet (8 metres) deep and surrounded by large stones.  The well was excavated and deepened to produce a total depth of 38 feet (12 metres), with the new section lined with square bricks.  At the top, it was 2 feet, 7 inches (787.4 millimetres) across at its narrowest diameter and 2 feet, 9 inches (838.2 millimetres) at its widest, widening to around 4 feet (1.2 metres) at the bottom.  The well was cleaned in 1863 and Smith added an iron gate to the top of it to preserve it whilst keeping it accessible.
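For anyone wanting to double-check those imperial-to-metric figures, here is a quick sketch of the arithmetic (the dimensions are the ones quoted above; the helper function is just for illustration):

```python
# Convert the Old Crown's quoted dimensions from imperial to metric,
# to check the conversions given in the text (1 inch = 25.4 mm exactly).

def to_metres(feet, inches=0):
    """Convert feet and inches to metres."""
    return (feet * 12 + inches) * 25.4 / 1000

# Ground-floor frontage: 71 ft 4 in
print(round(to_metres(71, 4), 2))           # 21.74 metres

# Narrowest diameter at the top of the well: 2 ft 7 in, in millimetres
print(round(to_metres(2, 7) * 1000, 1))     # 787.4 millimetres
```

Both match the figures quoted in the text.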

Image © Frank Parker

The Old Crown well.  Taken on 09/09/23.

Read more about The Old Crown here.

The above article is sourced from The Old Crown website, in the About The Old Crown section.  The rest is from Wikipedia and is subject to change.

Opening Times

Monday to Thursday: 12:00 noon to 11:30 p.m.

Food service until 9:00 p.m.

Friday to Saturday: 12:00 noon to 12:30 a.m.

Food service until 9:00 p.m.

Sunday: 12:00 noon to 11:00 p.m.

Food service until 5:45 p.m.

Bookings

The Old Crown, winner of the Best Traditional Pub at the 2019 Midland Food and Drink Hospitality Awards, has 10 bedrooms and 1 apartment available to book. 

Bookings are not compulsory but are highly recommended, especially for weekends, as they get very busy.  They always hold some space for walk-ins, so please feel free to come down even if your preferred date is full, and they will do their best to seat you.

Some dates will show as unavailable in their booking calendar due to events that are yet to be released.  

Although they do not have on-site parking, there are a number of local car parks (two located on the High Street, visible from the hotel); feel free to enquire with them for more details or directions.

To book a room or send them an enquiry via e-mail click here.

Sign up for their newsletter and be the first to find out about these events when they are announced.

Address

High St

Deritend

Birmingham

B12 0LD

Telephone

0121 248 1368.

E-Mail

Blog Posts

Links

Images on this page of The Old Crown are the copyright of Frank Parker unless otherwise stated.

The image above of The Old Crown in 2006 is the copyright of Wikipedia user Oosoom.   It comes with a Creative Commons licence (CC BY-SA 3.0). 

The Old Crown – Official website.

The Old Crown on Facebook.

The Old Crown on Twitter.

The Old Crown on Instagram.

The Old Crown on YouTube.

Creative Commons – Official website.  They promote better sharing of knowledge and culture, advancing universal access and fostering creativity, innovation, and collaboration.

 

Films

Image © of Bence Szemerey via Pixabay

Everyone loves watching a good film, whether at the cinema or at home on the television.  With a collection of over 1,000 DVDs that includes a LOT of films, it is clear that this is another big passion of mine.

I can’t remember the very first film I went to see at the pictures, but it was in the mid-1970’s and it could possibly have been the Disney animated adaptation of Robin Hood.  Visits to the cinema over the decades, as a child and as an adult, with family, have always held special memories for me.

Watching a film on the telly is always good, but nothing beats the experience and sound quality of watching it on the big screen.  Having a home cinema has always been a dream of mine, and although that probably won’t ever happen, one day I would like to get a decent surround sound system and a projector with a large screen, or a large telly, to watch films on.  I will say never say never on that one!

I like most film genres, my favourites being horror and science fiction.  I have favourite actors and actresses, the same as anyone else does, and they will be shown on this page.  I am not going to list every film I have watched in my lifetime (that would be IMPOSSIBLE to remember), but I will list films I have watched and enjoyed that I think are worth watching.  Of course, your opinions may differ from mine; that’s life.

About Film

A film, also called a movie, motion picture, moving picture, picture, photoplay, or flick is a work of visual art that simulates experiences and otherwise communicates ideas, stories, perceptions, feelings, beauty, or atmosphere through the use of moving images. Flick is, in general, a slang term, first recorded in 1926.  It originates in the verb flicker, owing to the flickering appearance of early films.  These images are generally accompanied by sound and, more rarely, other sensory stimulations.   The word cinema, short for cinematography, is often used to refer to filmmaking and the film industry, and the art form that is the result of it. 

The History Of Film

Precursors

The art of film has drawn on several earlier traditions in fields such as oral storytelling, literature, theatre, and visual arts.  Forms of art and entertainment that had already featured moving or projected images include shadowgraphy (probably used since prehistoric times), the camera obscura (a natural phenomenon that has possibly been used as an artistic aid since prehistoric times), shadow puppetry (which possibly originated around 200 BCE in Central Asia, India, Indonesia or China) and the magic lantern (developed in the 1650’s; the multi-media phantasmagoria shows that were popular from 1790 throughout the first half of the 19th century could feature mechanical slides, rear projection, mobile projectors, superimposition, dissolving views, live actors, smoke that was sometimes used to project images upon, odours, sounds, and even electric shocks).

Before Celluloid

The stroboscopic animation principle was introduced in 1833 with the stroboscopic disc (better known as the phenakisticope) and later applied in the zoetrope (since 1866), the flip book (since 1868), and the praxinoscope (since 1877) before it became the basic principle for cinematography.

Image © of Simon Ritter von Stampfer via Wikipedia

Prof. Stampfer’s Stroboscopische Scheibe No. X., created on the 22nd of June, 1833.  This is side Nr. 10 of the reworked second series of Stampfer’s stroboscopic disc published by Trentsensky & Vieweg in the same year.

Experiments with early phenakisticope-based animation projectors were made at least as early as 1843 and publicly screened in 1847.  Jules Duboscq marketed phenakisticope projection systems in France from circa 1853 until the 1890’s.

Photography was introduced in 1839, but initially, photographic emulsions needed such long exposures that the recording of moving subjects seemed impossible.  At least as early as 1844, a photographic series of subjects posed in different positions was created to either suggest a motion sequence or document a range of different viewing angles.  The advent of stereoscopic photography, with early experiments in the 1840’s and commercial success since the early 1850’s, raised interest in completing the photographic medium with the addition of means to capture colour and motion.  In 1849, Joseph Plateau published the idea to combine his invention of the phenakisticope with the stereoscope, as suggested to him by stereoscope inventor Charles Wheatstone, and to use photographs of plaster sculptures in different positions to be animated in the combined device.  In 1852, Jules Duboscq patented such an instrument as the Stereoscope-fantascope, ou Bioscope, but he only marketed it very briefly, without success.  One Bioscope disc with stereoscopic photographs of a machine is in the Plateau collection of Ghent University, but no instruments or other discs have yet been found.

By the late 1850’s the first examples of instantaneous photography came about and provided hope that motion photography would soon be possible, but it took a few decades before it was successfully combined with a method to record a series of sequential images in real-time.  In 1878, Eadweard Muybridge eventually managed to take a series of photographs of a running horse with a battery of cameras in a line along the track and published the results as The Horse in Motion on cabinet cards.  Muybridge, as well as Etienne-Jules Marey, Ottomar Anschütz, and many others, would create many more chronophotography studies.  Muybridge had the contours of dozens of his chronophotographic series traced onto glass discs and projected them with his zoopraxiscope in his lectures from 1880 to 1895.

Image © of Eadweard Muybridge via Wikipedia

An animation of the retouched Sallie Garner card from The Horse in Motion series by Eadweard Muybridge.  The series was from 1878 – 1879. 

Anschütz made his first instantaneous photographs in 1881.  He developed a portable camera that allowed shutter speeds as short as 1/1000 of a second in 1882.  The quality of his pictures was generally regarded to be much higher than that of the chronophotography works of Muybridge and Etienne-Jules Marey.  In 1886, Anschütz developed the Electrotachyscope, an early device that displayed short motion picture loops with 24 glass plate photographs on a 1.5-metre-wide rotating wheel that was hand-cranked to a speed of circa 30 frames per second.  Different versions were shown at many international exhibitions, fairs, conventions, and arcades from 1887 until at least 1894.  Starting in 1891, some 152 examples of a coin-operated peep-box Electrotachyscope model were manufactured by Siemens & Halske in Berlin and sold internationally.  Nearly 34,000 people paid to see it at the Berlin Exhibition Park in the summer of 1892.  Others saw it in London or at the 1893 Chicago World’s Fair.  On the 25th of November 1894, Anschütz introduced an Electrotachyscope projector with a 6 x 8 metre screen in Berlin.  Between the 22nd of February and the 30th of March 1895, a total of circa 7,000 paying customers came to view a 1.5-hour show of some 40 scenes at a 300-seat hall in the old Reichstag building in Berlin.

Image © unknown via Wikipedia

A picture of Ottomar Anschütz’s Electrotachyscope, first published in Scientific American on the 16th of November, 1889.

Emile Reynaud had already mentioned the possibility of projecting the images of the Praxinoscope in his 1877 patent application.  He presented a praxinoscope projection device at the Societe Francaise de Photographie on the 4th of June 1880 but did not market his praxinoscope before 1882.  He then further developed the device into the Theatre Optique, which could project longer sequences with separate backgrounds, patented in 1888.  He created several movies for the machine by painting images on hundreds of gelatin plates that were mounted into cardboard frames and attached to a cloth band.  From the 28th of October 1892 to March 1900 Reynaud gave over 12,800 shows to a total of over 500,000 visitors at the Grevin Museum in Paris.

First Motion Pictures

By the end of the 1880’s, the introduction of lengths of celluloid photographic film and the invention of motion picture cameras, which could photograph a rapid sequence of images using only one lens, allowed the action to be captured and stored on a single compact reel of film.

Movies were initially shown publicly to one person at a time through peep show devices such as the Electrotachyscope, Kinetoscope, and the Mutoscope.  Not much later, exhibitors managed to project films on large screens for theatre audiences.

The first public screenings of films at which admission was charged were made in 1895 by the American Woodville Latham and his sons, using films produced by their Eidoloscope company, by the Skladanowsky brothers, and by the arguably better known  French brothers Auguste and Louis Lumiere with ten of their own productions.  Private screenings had preceded these by several months, with Latham’s slightly predating the others.

Roundhay Garden Scene is a short silent motion picture filmed by French inventor Louis Le Prince at Oakwood Grange in Roundhay, Leeds, in northern England on the 14th of October 1888.

Pauvre Pierrot, or Poor Pierrot as it is known in English, is a French short animated film directed by Charles-Emile Reynaud in 1891 and released in 1892.

Georges Melies’ Le Voyage dans la Lune or A Trip to the Moon as it is known in English is an early narrative film and also an early science fiction film, released in 1902.

The Bond is a two-reel propaganda film created by Charlie Chaplin at his own expense for the Liberty Loan Committee to help sell U.S. Liberty Bonds during World War I, released in 1918. 

Early Evolution

The earliest films were simply one static shot that showed an event or action with no editing or other cinematic techniques.  Typical films showed employees leaving a factory gate, people walking in the street, and the view from the front of a trolley as it travelled a city’s main street.  According to legend, when a film showed a locomotive at high speed approaching the audience, the audience panicked and ran from the theatre.  Around the turn of the 20th century, films started stringing several scenes together to tell a story.  The filmmakers who first put several shots or scenes together discovered that, when one shot follows another, that act establishes a relationship between the content of the separate shots in the mind of the viewer.  It is this relationship that makes all film storytelling possible.  In a simple example, if a person is shown looking out of a window, whatever the next shot shows will be regarded as the view the person was seeing.  Each scene was a single stationary shot with the action occurring before it.  The scenes were later broken up into multiple shots photographed from different distances and angles.  Other techniques, such as camera movement, were developed as effective ways to tell a story with film.

Until sound film became commercially practical in the late 1920’s, motion pictures were a purely visual art, but these innovative silent films had gained a hold on the public imagination.  Rather than leave audiences with only the noise of the projector as an accompaniment, theatre owners hired a pianist or organist or, in large urban theatres, a full orchestra to play music that fit the mood of the film at any given moment.  By the early 1920’s, most films came with a prepared list of sheet music to be used for this purpose, and complete film scores were composed for major productions.

The rise of European cinema was interrupted by the outbreak of World War I, while the film industry in the United States flourished with the rise of Hollywood, typified most prominently by the innovative work of D. W. Griffith in The Birth of a Nation (1915) and Intolerance (1916).  However, in the 1920’s, European filmmakers such as Eisenstein, F. W. Murnau, and Fritz Lang, in many ways inspired by the meteoric wartime progress of film through Griffith, along with the contributions of Charles Chaplin, Buster Keaton, and others, quickly caught up with American film-making and continued to further advance the medium.

Sound

In the 1920’s, the development of electronic sound recording technologies made it practical to incorporate a soundtrack of speech, music, and sound effects synchronized with the action on the screen.  The resulting sound films were initially distinguished from the usual silent moving pictures or movies by calling them talking pictures or talkies.  The revolution they wrought was swift.  By 1930, silent film was practically extinct in the US and already being referred to as the old medium.

The evolution of sound in cinema began with the idea of combining moving images with existing phonograph sound technology.  Early sound-film systems, such as Thomas Edison’s Kinetophone and the Vitaphone used by Warner Bros., laid the groundwork for synchronized sound in film.  The Vitaphone system, produced alongside Bell Telephone Company and Western Electric, faced initial resistance due to expensive equipping costs, but sound in cinema gained acceptance with movies like Don Juan (1926) and The Jazz Singer (1927).

Sound-on-film soon became the standard in American film studios, while Europe standardized on the Tobis-Klangfilm and Tri-Ergon systems.  This new technology allowed for greater fluidity in film, giving rise to more complex and epic movies like King Kong (1933).

As the television threat emerged in the 1940’s and 1950’s, the film industry needed to innovate to attract audiences.  In terms of sound technology, this meant the development of surround sound and more sophisticated audio systems, such as Cinerama’s seven-channel system.  However, these advances required a large number of personnel to operate the equipment and maintain the sound experience in cinemas.

In 1966, Dolby Laboratories introduced the Dolby A noise reduction system, which became a standard in the recording industry and eliminated the hissing sound associated with earlier standardization efforts.  Dolby Stereo, a revolutionary surround sound system, followed and allowed cinema designers to take acoustics into consideration when designing cinemas.  This innovation enabled audiences in smaller venues to enjoy comparable audio experiences to those in larger city cinemas.

Today, the future of sound in film remains uncertain, with potential influences from artificial intelligence, remastered audio, and personal viewing experiences shaping its development.  However, it is clear that the evolution of sound in cinema has been marked by continuous innovation and a desire to create more immersive and engaging experiences for audiences.

Colour

A significant technological advancement in the film industry was the introduction of natural colour, where colour was captured directly from nature through photography, as opposed to being manually added to black-and-white prints using techniques like hand-colouring or stencil-colouring.  Early colour processes often produced colours that appeared far from natural.  Unlike the rapid transition from silent films to sound films, colour’s replacement of black-and-white happened more gradually.

The crucial innovation was the three-strip version of the Technicolor process, first used in animated cartoons in 1932.  The process was later applied to live-action short films, specific sequences in feature films, and finally, to an entire feature film, Becky Sharp, in 1935.  Although the process was expensive, the positive public response, as evidenced by increased box office revenue, generally justified the additional cost.  Consequently, the number of films made in colour gradually increased year after year.

The 1950’s: The Growing Influence Of Television

In the early 1950’s, the proliferation of black-and-white television started seriously depressing North American cinema attendance.  In an attempt to lure audiences back into cinemas, bigger screens were installed, widescreen processes, polarised 3D projection, and stereophonic sound were introduced, and more films were made in colour, which soon became the rule rather than the exception.  Some important mainstream Hollywood films were still being made in black-and-white as late as the mid-1960’s, but they marked the end of an era.  Colour television receivers had been available in the U.S. since the mid-1950’s, but at first, they were very expensive and few broadcasts were in colour.  During the 1960’s, prices gradually came down, colour broadcasts became common, and sales boomed.  The overwhelming public verdict in favour of colour was clear.  After the final flurry of black-and-white films had been released in mid-decade, all Hollywood studio productions were filmed in colour, with the usual exceptions made only at the insistence of star filmmakers such as Peter Bogdanovich and Martin Scorsese.

The 1960’s And Later

The decades following the decline of the studio system in the 1960’s saw changes in the production and style of film.  Various New Wave movements (including the French New Wave, New German Cinema wave, Indian New Wave, Japanese New Wave, New Hollywood, and Egyptian New Wave) and the rise of film-school-educated independent filmmakers contributed to the changes the medium experienced in the latter half of the 20th century.  Digital technology has been the driving force for change throughout the 1990’s and into the 2000’s.  Digital 3D projection largely replaced earlier problem-prone 3D film systems and became popular in the early 2010’s.

Image © unknown via Wikipedia

Salah Zulfikar, one of the most popular actors in the golden age of Egyptian Cinema.

Etymology And Alternative Terms

The name film originally referred to the thin layer of photochemical emulsion on the celluloid strip that used to be the actual medium for recording and displaying motion pictures.

The most common term in Europe is film, while in the United States movie is preferred.

Archaic terms include animated pictures and animated photography. Common terms for the field, in general, include the big screen, the silver screen, the movies, and cinema.  The last of these is commonly used, as an overarching term, in scholarly texts and critical essays.  In the early years, the word sheet was sometimes used instead of screen.

Recording And Transmission Of The Film

The moving images of a film are created by photographing actual scenes with a motion-picture camera, by photographing drawings or miniature models using traditional animation techniques, by means of C.G.I. and computer animation, or by a combination of some or all of these techniques, and other visual effects.

Before the introduction of digital production, a series of still images was recorded on a strip of chemically sensitised celluloid (photographic film stock), usually at a rate of 24 frames per second.  The images are shown through a movie projector at the same rate as they were recorded, with a Geneva drive ensuring that each frame remains still during its short projection time.  A rotating shutter causes stroboscopic intervals of darkness, but the viewer does not notice the interruptions due to flicker fusion.  The apparent motion on the screen is the result of the fact that the visual sense cannot discern the individual images at high speeds, so the impressions of the images blend with the dark intervals and are thus linked together to produce the illusion of one moving image.  An analogue optical soundtrack (a graphic recording of the spoken words, music, and other sounds) runs along a portion of the film exclusively reserved for it and is not projected.
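The 24-frames-per-second arithmetic described above is easy to check.  Here is a rough sketch of it; the standard figures of 24 frames per second and 16 frames per foot of 35 mm film are general knowledge rather than taken from this article:

```python
# Rough arithmetic behind sound-era film projection.  Assumes the usual
# figures: 24 frames per second, and 16 frames per foot of 35 mm film.
FPS = 24
FRAMES_PER_FOOT_35MM = 16

frame_duration_ms = 1000 / FPS                      # time each frame is held
feet_per_minute = FPS * 60 / FRAMES_PER_FOOT_35MM   # how fast the film moves

def film_length_feet(runtime_minutes):
    """Feet of 35 mm film needed for a given runtime at 24 frames per second."""
    return runtime_minutes * feet_per_minute

print(round(frame_duration_ms, 1))  # 41.7 (milliseconds per frame)
print(feet_per_minute)              # 90.0 (feet of film per minute)
print(film_length_feet(120))        # 10800.0 (feet for a two-hour feature)
```

So a projector holds each frame on screen for only about a twenty-fourth of a second, and a two-hour feature consumes roughly two miles of film.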

Contemporary films are usually fully digital through the entire process of production, distribution, and exhibition.

Film Theory

Film theory seeks to develop concise and systematic concepts that apply to the study of film as art.  The concept of film as an art-form began in 1911 with Ricciotto Canudo’s manifesto The Birth of the Sixth Art.  The Moscow Film School, the oldest film school in the world, was founded in 1919, in order to teach about and research film theory.  Formalist film theory, led by Rudolf Arnheim, Béla Balázs, and Siegfried Kracauer, emphasised how film differed from reality and thus could be considered a valid fine art.  André Bazin reacted against this theory by arguing that film’s artistic essence lay in its ability to mechanically reproduce reality, not in its differences from reality, and this gave rise to realist theory.  More recent analysis, spurred by Jacques Lacan’s psychoanalysis and Ferdinand de Saussure’s semiotics among other things, has given rise to psychoanalytic film theory, structuralist film theory, feminist film theory, and others.  On the other hand, critics from the analytical philosophy tradition, influenced by Wittgenstein, try to clarify misconceptions used in theoretical studies and produce analysis of a film’s vocabulary and its link to a form of life.

Image © Janke via Wikipedia

The Bolex H16 Reflex camera.

Language

Film is considered to have its own language.  James Monaco wrote a classic text on film theory, titled How to Read a Film, that addresses this.  Director Ingmar Bergman famously said, “Andrei Tarkovsky for me is the greatest director, the one who invented a new language, true to the nature of film, as it captures life as a reflection, life as a dream.”  An example of the language is a sequence of back-and-forth images of one speaking actor’s left profile, followed by another speaking actor’s right profile, then a repetition of this, which is a language understood by the audience to indicate a conversation.  This describes another theory of film, the 180-degree rule, as a visual story-telling device with an ability to place a viewer in a context of being psychologically present through the use of visual composition and editing.  The Hollywood style includes this narrative theory, due to the overwhelming practice of the rule by movie studios based in Hollywood, California, during film’s classical era.  Another example of cinematic language is having a shot that zooms in on the forehead of an actor with an expression of silent reflection that cuts to a shot of a younger actor who vaguely resembles the first actor, indicating that the first person is remembering a past self, an edit of compositions that causes a time transition.

Montage

Read more about Montage here.

Montage is a film editing technique in which separate pieces of film are selected, edited, and assembled to create a new section or sequence within a film.  This technique can be used to convey a narrative or to create an emotional or intellectual effect by juxtaposing different shots, often for the purpose of condensing time, space, or information.  Montage can involve flashbacks, parallel action, or the interplay of various visual elements to enhance the storytelling or create symbolic meaning.

The concept of montage emerged in the 1920’s, with pioneering Soviet filmmakers such as Sergei Eisenstein and Lev Kuleshov developing the theory of montage. Eisenstein’s film Battleship Potemkin (1925) is a prime example of the innovative use of montage, where he employed complex juxtapositions of images to create a visceral impact on the audience. 

As the art of montage evolved, filmmakers began incorporating musical and visual counterpoint to create a more dynamic and engaging experience for the viewer.  The development of scene construction through mise-en-scène, editing, and special effects led to more sophisticated techniques that can be compared to those utilized in opera and ballet.

The French New Wave movement of the late 1950’s and 1960’s also embraced the montage technique, with filmmakers such as Jean-Luc Godard and François Truffaut using montage to create distinctive and innovative films.  This approach continues to be influential in contemporary cinema, with directors employing montage to create memorable sequences in their films.

In contemporary cinema, montage continues to play an essential role in shaping narratives and creating emotional resonance.  Filmmakers have adapted the traditional montage technique to suit the evolving aesthetics and storytelling styles of modern cinema:

Rapid editing and fast-paced montages: With the advent of digital editing tools, filmmakers can now create rapid and intricate montages to convey information or emotions quickly.  Films like Darren Aronofsky’s Requiem for a Dream (2000) and Edgar Wright’s Shaun of the Dead (2004) employ fast-paced editing techniques to create immersive and intense experiences for the audience.

Music video influence: The influence of music videos on film has led to the incorporation of stylized montage sequences, often accompanied by popular music.  Films like Guardians of the Galaxy (2014) and Baby Driver (2017) use montage to create visually striking sequences that are both entertaining and narratively functional.

Sports and training montages: The sports and training montage has become a staple in modern cinema, often used to condense time and show a character’s growth or development.  Examples of this can be found in films like Rocky (1976), The Karate Kid (1984), and Million Dollar Baby (2004).

Cross-cutting and parallel action: Contemporary filmmakers often use montage to create tension and suspense by cross-cutting between parallel storylines.  Christopher Nolan’s Inception (2010) and Dunkirk (2017) employ complex cross-cutting techniques to build narrative momentum and heighten the audience’s emotional engagement.

Thematic montage: Montage can also be used to convey thematic elements or motifs in a film.  Wes Anderson’s The Royal Tenenbaums (2001) employs montage to create a visual language that reflects the film’s themes of family, nostalgia, and loss.

As the medium of film continues to evolve, montage remains an integral aspect of visual storytelling, with filmmakers finding new and innovative ways to employ this powerful technique.

Film Criticism

Film criticism is the analysis and evaluation of films.  In general, these works can be divided into two categories: academic criticism by film scholars and journalistic film criticism that appears regularly in newspapers and other media.  Film critics working for newspapers, magazines, and broadcast media mainly review new releases.  Normally they only see any given film once and have only a day or two to formulate their opinions.  Despite this, critics have an important impact on the audience response and attendance at films, especially those of certain genres.  Mass-marketed action, horror, and comedy films tend not to be greatly affected by a critic’s overall judgment of a film.  The plot summary and description of a film and the assessment of the director’s and screenwriters’ work that make up the majority of most film reviews can still have an important impact on whether people decide to see a film.  For prestige films such as most dramas and art films, the influence of reviews is important.  Poor reviews from leading critics at major papers and magazines will often reduce audience interest and attendance.

The impact of a reviewer on a given film’s box office performance is a matter of debate.  Some observers claim that movie marketing in the 2000’s is so intense, well-coordinated and well-financed that reviewers cannot prevent a poorly written or filmed blockbuster from attaining market success.  However, the cataclysmic failure of some heavily promoted films which were harshly reviewed, as well as the unexpected success of critically praised independent films, indicates that extreme critical reactions can have considerable influence.  Other observers note that positive film reviews have been shown to spark interest in little-known films.  Conversely, there have been several films in which film companies have so little confidence that they refuse to give reviewers an advance viewing to avoid widespread panning of the film.  However, this usually backfires, as reviewers are wise to the tactic and warn the public that the film may not be worth seeing and the films often do poorly as a result.  Journalist film critics are sometimes called film reviewers.  Critics who take a more academic approach to films, through publishing in film journals and writing books about films using film theory or film studies approaches, study how film and filming techniques work, and what effect they have on people.  Rather than having their reviews published in newspapers or appearing on television, their articles are published in scholarly journals or up-market magazines.  They also tend to be affiliated with colleges or universities as professors or instructors.

In 1986, Roger Ebert, winner of the Pulitzer Prize for Criticism said, “If a movie can illuminate the lives of other people who share this planet with us and show us not only how different they are but, how even so, they share the same dreams and hurts, then it deserves to be called great.”

Industry

Read more about Industry here.  Read more about World Cinema here.

The making and showing of motion pictures became a source of profit almost as soon as the process was invented.  Upon seeing how successful their new invention, and its product, was in their native France, the Lumieres quickly set about touring the Continent to exhibit the first films privately to royalty and publicly to the masses.  In each country, they would normally add new, local scenes to their catalogue and, quickly enough, found local entrepreneurs in the various countries of Europe to buy their equipment and photograph, export, import, and screen additional product commercially.  The Oberammergau Passion Play of 1898 was the first commercial motion picture ever produced.  Other pictures soon followed, and motion pictures became a separate industry that overshadowed the vaudeville world.  Dedicated theaters and companies formed specifically to produce and distribute films, while motion picture actors became major celebrities and commanded huge fees for their performances. By 1917 Charlie Chaplin had a contract that called for an annual salary of one million dollars.  From 1931 to 1956, film was also the only image storage and playback system for television programming until the introduction of videotape recorders.

In the United States, much of the film industry is centered around Hollywood, California.  Other regional centers exist in many parts of the world, such as Mumbai-centered Bollywood, the Indian film industry’s Hindi cinema which produces the largest number of films in the world.  Though the expense involved in making films has led cinema production to concentrate under the auspices of movie studios, recent advances in affordable film making equipment have allowed independent film productions to flourish.

Profit is a key force in the industry, due to the costly and risky nature of filmmaking; many films have large cost overruns, an example being Kevin Costner’s Waterworld.  Yet many filmmakers strive to create works of lasting social significance.  The Academy Awards (also known as the Oscars) are the most prominent film awards in the United States, providing recognition each year to films based on their artistic merits (though it has got so woke lately that this is questionable indeed).  There is also a large industry for educational and instructional films made in lieu of or in addition to lectures and texts.  Revenue in the industry is sometimes volatile due to the reliance on blockbuster films released in movie theatres.  The rise of alternative home entertainment has raised questions about the future of the cinema industry, and Hollywood employment has become less reliable, particularly for medium and low-budget films.

World Cinema

Read more about World Cinema here.

World cinema is a term in film theory that refers to films made outside of the American motion picture industry, particularly those in opposition to the aesthetics and values of commercial American cinema.  The Third Cinema of Latin America and various national cinemas are commonly identified as part of world cinema.  The term has been criticized for Americentrism and for ignoring the diversity of different cinematic traditions around the world.

Image © unknown via Wikipedia

Most productive cinemas around the world based on IMDb (as of 2009).  Over 10,000 titles (green), over 5,000 (yellow), over 1,000 (blue).

Associated Fields

Read more about Film theory here, Product placement here, and Propaganda here.

Derivative academic fields of study may both interact with and develop independently of filmmaking, as in film theory and analysis.  Fields of academic study have been created that are derivative or dependent on the existence of film, such as film criticism, film history, divisions of film propaganda in authoritarian governments, or psychological studies of subliminal effects (e.g., of a flashing soda can during a screening).  These fields may further create derivative fields, such as a movie review section in a newspaper or a television guide.  Sub-industries can spin off from film, such as popcorn makers and film-related toys (e.g., Star Wars figures).  Sub-industries of pre-existing industries may deal specifically with film, such as product placement and other advertising within films.

Terminology

The terminology used for describing motion pictures varies considerably between British and American English.  In British usage, the name of the medium is film.  The word movie is understood but seldom used.  Additionally, the pictures (plural) is used semi-frequently to refer to the place where movies are exhibited; in American English this may be called the movies, though the phrase is becoming outdated.  In other countries, the place where movies are exhibited may be called a cinema or movie theatre.  By contrast, in the United States, movie is the predominant form.  Although the words film and movie are sometimes used interchangeably, film is more often used when considering artistic, theoretical, or technical aspects.  The term movies more often refers to entertainment or commercial aspects, such as where to go for a fun evening on a date.  For example, a book titled How to Understand a Film would probably be about the aesthetics or theory of film, while a book entitled Let’s Go to the Movies would probably be about the history of entertaining movies and blockbusters.

Further terminology is used to distinguish various forms and media used in the film industry.  Motion pictures and moving pictures are frequently used terms for film and movie productions specifically intended for theatrical exhibition, such as, for instance, Star Wars.  DVD and videotape are video formats that can reproduce a photochemical film.  A reproduction based on such is called a transfer.  After the advent of theatrical film as an industry, the television industry began using videotape as a recording medium.  For many decades, the tape was solely an analogue medium onto which moving images could be either recorded or transferred.  Film and filming refer to the photochemical medium that chemically records a visual image and the act of recording respectively.  However, the act of shooting images with other visual media, such as with a digital camera, is still called filming and the resulting works are often called films as interchangeable with movies, despite not being shot on film.  Silent films need not be utterly silent but are films and movies without an audible dialogue, including those that have a musical accompaniment.  The word talkies refers to the earliest sound films created to have audible dialogue recorded for playback along with the film, regardless of a musical accompaniment.  Cinema either broadly encompasses both films and movies, or it is roughly synonymous with film and theatrical exhibition, and both are capitalised when referring to a category of art.  The silver screen refers to the projection screen used to exhibit films and, by extension, is also used as a metonym for the entire film industry.

Widescreen refers to a larger width to height in the frame, compared to earlier historic aspect ratios.  A feature-length film, or feature film, is of a conventional full length, usually 60 minutes or more, and can commercially stand by itself without other films in a ticketed screening.  A short is a film that is not as long as a feature-length film, often screened with other shorts, or preceding a feature-length film.  An independent is a film made outside the conventional film industry.

In U.S. usage, one talks of a screening or projection of a movie or video on a screen at a public or private theatre.  In British English, a film showing happens at a cinema, never a theatre, which is a different medium and place altogether.  A cinema usually refers to an arena designed specifically to exhibit films, where the screen is affixed to a wall, while a theatre usually refers to a place where live, non-recorded action or combination thereof occurs from a podium or other type of stage, including the amphitheatre.  Theatres can still screen movies in them, though the theatre would be retrofitted to do so.  One might propose going to the cinema when referring to the activity, or sometimes to the pictures in British English, whereas the U.S. expression is usually going to the movies.  A cinema usually shows a mass-marketed movie using a front-projection screen process with either a film projector or, more recently, with a digital projector.  But, cinemas may also show theatrical movies from their home video transfers that include Blu-ray Disc, DVD, and videocassette when they possess sufficient projection quality or based upon need, such as movies that exist only in their transferred state, which may be due to the loss or deterioration of the film master and prints from which the movie originally existed.  Due to the advent of digital film production and distribution, physical film might be absent entirely.  A double feature is a screening of two independently marketed, stand-alone feature films.  A viewing is a watching of a film.  Sales and at the box office refer to tickets sold at a theatre, or more currently, rights sold for individual showings.  A release is the distribution and often simultaneous screening of a film.  A preview is a screening in advance of the main release.

Any film may also have a sequel, which portrays events following those in the film.  Bride of Frankenstein is an early example.  When there is more than one film with the same characters, story arcs, or subject themes, these movies become a series, such as the James Bond series.  A film that exists outside a specific story timeline is not usually excluded from being part of a series.  A film that portrays events occurring earlier in a timeline than those in another film, but is released after that film, is sometimes called a prequel, an example being Butch and Sundance: The Early Days.

The credits, or end credits, are a list that gives credit to the people involved in the production of a film.  Films from before the 1970’s usually open with the credits, often ending with only a title card saying The End or some equivalent, often one that depends on the language of the production.  From then onward, a film’s credits usually appear at the end of most films.  However, films that end with credits often repeat some of them, such as the acting leads, at or near the start as well, so those credits appear twice; less frequently, some credits appearing near or at the beginning, often the director’s, appear only there and not at the end.  The credits appearing at or near the beginning of a film are usually called titles or beginning titles.  A post-credits scene is a scene shown after the end of the credits.  Ferris Bueller’s Day Off has a post-credits scene in which Ferris tells the audience that the film is over and they should go home.

A film’s cast refers to a collection of the actors and actresses who appear, or star, in a film.  A star is an actor or actress, often a popular one, and in many cases, a celebrity who plays a central character in a film.  Occasionally the word can also be used to refer to the fame of other members of the crew, such as a director or other personality, such as Martin Scorsese.  A crew is usually interpreted as the people involved in a film’s physical construction outside cast participation, and it could include directors, film editors, photographers, grips, gaffers, set decorators, prop masters, and costume designers.  A person can both be part of a film’s cast and crew, such as Woody Allen, who directed and starred in Take the Money and Run.

A film goer, movie goer, or film buff is a person who likes or often attends films and movies; any of these, though more often the latter, may also see themselves as a student of films and movies.  Intense interest in films, film theory, and film criticism is known as cinephilia.  A film enthusiast is known as a cinephile or cineaste.

Preview

Read more about Test screening here.

A preview performance refers to a showing of a film to a select audience, usually for the purposes of corporate promotions, before the public film premiere itself.  Previews are sometimes used to judge audience reaction, which, if unexpectedly negative, may result in recutting or even refilming certain sections based on the audience response.  One example of a film that was changed after a negative response from the test screening is 1982’s First Blood.  After the test audience responded very negatively to the death of protagonist John Rambo (a Vietnam veteran) at the end of the film, the company wrote and re-shot a new ending in which the character survives.

Trailer And Teaser

Read more about the Film trailer here.

Trailers or previews are advertisements for films that will be shown in 1 to 3 months at a cinema.  Back in the early days of cinema, with cinemas that had only one or two screens, only certain trailers were shown for the films that were going to be shown there.  Later, when cinemas added more screens or new cinemas were built with a lot of screens, all different trailers were shown even if they were not going to play that film in that cinema.  Film studios realised that the more trailers that were shown (even if the film was not going to be shown in that particular cinema), the more patrons would go to a different cinema to see the film when it came out.  The term trailer comes from their having originally been shown at the end of a film.  That practice did not last long because patrons tended to leave the theatre after the films ended, but the name stuck.  Trailers are now shown before the film, or when the first film in a double feature begins.  Film trailers are also common on DVD’s and Blu-ray Discs, as well as on the Internet and mobile devices.  Trailers are created to be engaging and interesting for viewers.  As a result, in the Internet era, viewers often seek out trailers to watch them.  Of the ten billion videos watched online annually in 2008, film trailers ranked third, after news and user-created videos.  Teasers are a much shorter preview or advertisement that lasts only 10 to 30 seconds.  Teasers are used to get patrons excited about a film coming out in the next six to twelve months.  Teasers may be produced even before the film production is completed.

The Role Of Film In Culture

Films are cultural artefacts created by specific cultures, facilitating intercultural dialogue.  Film is considered an important art form that provides entertainment and historical value, often visually documenting a period of time.  The visual basis of the medium gives it a universal power of communication, often stretched further through the use of dubbing or subtitles to translate the dialogue into other languages.  Just seeing a location in a film is linked to higher tourism to that location, demonstrating how powerful the suggestive nature of the medium can be.

Education And Propaganda

Read more about Educational films here and Propaganda films here.

Film is used for a range of goals, including education and propaganda, due to its ability to effectively facilitate intercultural dialogue.  When the purpose is primarily educational, a film is called an educational film.  Examples are recordings of academic lectures and experiments, or a film based on a classic novel.  Film may be propaganda, in whole or in part, such as the films made by Leni Riefenstahl in Nazi Germany, U.S. war film trailers during World War II, or artistic films made under Stalin by Sergei Eisenstein.  They may also be works of political protest, as in the films of Andrzej Wajda, or more subtly, the films of Andrei Tarkovsky.  The same film may be considered educational by some, and propaganda by others, as the categorisation of a film can be subjective.

Production

Read more about Filmmaking here.

At its core, the means to produce a film depend on the content the filmmaker wishes to show and the apparatus for displaying it: the zoetrope, for example, merely requires a series of images on a strip of paper.  Film production can, therefore, take as little as one person with a camera, or even without a camera, as in Stan Brakhage’s 1963 film Mothlight, or thousands of actors, extras, and crew members for a live-action, feature-length epic.  The necessary steps for almost any film can be boiled down to conception, planning, execution, revision, and distribution.  The more involved the production, the more significant each of the steps becomes.  In a typical production cycle of a Hollywood-style film, these main stages are defined as development, pre-production, production, post-production and distribution.

This production cycle usually takes three years.  The first year is taken up with development.  The second year comprises pre-production and production.  The third year, post-production and distribution.  The bigger the production, the more resources it takes, and the more important financing becomes.  Most feature films are artistic works from the creators’ perspective (e.g., film directors, cinematographers, screenwriters) and for-profit business entities for the production companies.

Crew

Read more about the Film crew here.

A film crew is a group of people hired by a film company, and employed during the production or photography phase, for the purpose of producing a film or motion picture.  Crew is distinguished from cast, who are the actors who appear in front of the camera or provide voices for characters in the film.  The crew interacts with but is also distinct from the production staff, consisting of producers, managers, company representatives, their assistants, and those whose primary responsibility falls in the pre-production or post-production phases, such as screenwriters and film editors.  Communication between production and crew generally passes through the director and his/her staff of assistants.  Medium-to-large crews are generally divided into departments with well-defined hierarchies and standards for interaction and cooperation between the departments.  Other than acting, the crew handles everything in the photography phase such as props and costumes, shooting, sound, electrics, i.e., lights, sets, and production special effects.  Caterers (known in the film industry as craft services) are usually not considered part of the crew.

Technology

Read more about Cinema Techniques here.

Film stock consists of transparent celluloid, acetate, or polyester base coated with an emulsion containing light-sensitive chemicals.  Cellulose nitrate was the first type of film base used to record motion pictures, but due to its flammability was eventually replaced by safer materials.  Stock widths and the film format for images on the reel have had a rich history, though most large commercial films are still shot on 35 mm film (and distributed to theatres as 35 mm prints).  Originally, moving picture film was shot and projected at various speeds using hand-cranked cameras and projectors; though 1,000 frames per minute (16⅔ frames per second) is generally cited as a standard silent speed, research indicates most films were shot between 16 and 23 frames per second and projected from 18 frames per second on up (often reels included instructions on how fast each scene should be shown).  When synchronised sound film was introduced in the late 1920’s, a constant speed was required for the sound head.  24 frames per second was chosen because it was the slowest (and thus cheapest) speed which allowed for sufficient sound quality.  The standard was set with Warner Bros.’s The Jazz Singer and their Vitaphone system in 1927.  Improvements since the late 19th century include the mechanisation of cameras, allowing them to record at a consistent speed; quiet camera design, allowing sound recorded on-set to be usable without requiring large blimps to encase the camera; the invention of more sophisticated film stocks and lenses, allowing directors to film in increasingly dim conditions; and the development of synchronised sound, allowing sound to be recorded at exactly the same speed as its corresponding action.  The soundtrack can be recorded separately from shooting the film, but many parts of the soundtrack are usually recorded simultaneously for live-action pictures.
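The silent-to-sound speed change described above can be sketched with a little arithmetic; the helper function here is purely illustrative, not anything from the article:

```python
# Sketch of the frame-rate arithmetic above: silent-era footage shot at a
# slow speed runs fast when projected at the sound-era standard of 24 fps.
SOUND_FPS = 24

def projected_runtime(shot_seconds, shot_fps, projection_fps=SOUND_FPS):
    """Seconds a scene lasts on screen when projected at a different rate."""
    total_frames = shot_seconds * shot_fps
    return total_frames / projection_fps

silent_fps = 1000 / 60  # the cited silent standard of 1,000 frames per minute
print(round(silent_fps, 2))       # 16.67 frames per second

# A one-minute scene shot at 16 fps lasts only 40 seconds at 24 fps,
# which is why unadjusted silent footage looks sped up by half again.
print(projected_runtime(60, 16))  # 40.0
```

This is the reason old silent comedies look comically fast when run through a modern projector without frame-rate compensation.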

As a medium, film is not limited to motion pictures, since the technology developed as the basis for photography.  It can be used to present a progressive sequence of still images in the form of a slideshow.  Film has also been incorporated into multimedia presentations and often has importance as primary historical documentation.  However, historic films pose problems in terms of preservation and storage, and the motion picture industry is exploring many alternatives.  Most films on cellulose nitrate base have been copied onto modern safety films.  Some studios save colour films through the use of separation masters: three B&W negatives, each exposed through red, green, or blue filters (essentially a reverse of the Technicolor process).  Digital methods have also been used to restore films, although the continued obsolescence cycle of digital formats makes them (as of 2006) a poor choice for long-term preservation.  Film preservation of decaying film stock is a matter of concern both to film historians and archivists and to companies interested in preserving their existing products in order to make them available to future generations (and thereby increase revenue).  Preservation is generally a higher concern for nitrate and single-strip colour films, due to their high decay rates; black-and-white films on safety bases and colour films preserved on Technicolor imbibition prints tend to keep much better, assuming proper handling and storage.

Some films in recent decades have been recorded using analogue video technology similar to that used in television production.  Modern digital video cameras and digital projectors are gaining ground as well.  These approaches are preferred by some film-makers, especially because footage shot with digital cinema can be evaluated and edited with non-linear editing systems (N.L.E.) without waiting for the film stock to be processed.  The migration was gradual, and as of 2005, most major motion pictures were still shot on film.

Independent

Read more about Independent film here.

Independent filmmaking often takes place outside Hollywood or other major studio systems.  An independent film (or indie film) is a film initially produced without financing or distribution from a major film studio.  Creative, business, and technological reasons have all contributed to the growth of the indie film scene in the late 20th and early 21st century.  On the business side, the high costs of big-budget studio films lead to conservative choices in cast and crew.  There is a trend in Hollywood towards co-financing (over two-thirds of the films put out by Warner Bros. in 2000 were joint ventures, up from 10% in 1987).  A hopeful director is almost never given the opportunity to get a job on a big-budget studio film unless he or she has significant industry experience in film or television.  Also, the studios rarely produce films with unknown actors, particularly in lead roles.

Before the advent of digital alternatives, the cost of professional film equipment and stock was also a hurdle to being able to produce, direct, or star in a traditional studio film.  The advent of consumer camcorders in 1985, and more importantly, the arrival of high-resolution digital video in the early 1990’s, lowered the technology barrier to film production significantly.  Both production and post-production costs have been significantly reduced.  By the 2000’s, the hardware and software for post-production could run on a commodity personal computer.  Technologies such as DVD’s, FireWire connections, and a wide variety of professional and consumer-grade video editing software make film-making relatively affordable.

Since the introduction of digital video (D.V.) technology, the means of production have become more democratised.  Filmmakers can conceivably shoot a film with a digital video camera and edit it, create and edit the sound and music, and mix the final cut on a high-end home computer.  However, while the means of production may be democratised, financing, distribution, and marketing remain difficult to accomplish outside the traditional system.  Most independent filmmakers rely on film festivals to get their films noticed and sold for distribution.  The arrival of internet-based video websites such as YouTube and Veoh has further changed the filmmaking landscape, enabling indie filmmakers to make their films available to the public.

Image © unknown via Wikipedia

The Lumiere Brothers were among the first filmmakers.

Open Content Film

Read more about Open content film here.

An open-content film is much like an independent film, but it is produced through open collaborations.  Its source material is made available under a licence permissive enough to allow other parties to create fan fiction or derivative works, rather than under a traditional copyright.  Like independent filmmaking, open-source filmmaking takes place outside Hollywood or other major studio systems.  For example, the film Balloon was based on a real event during the Cold War.

Fan Film

Read more about Fan films here.

A fan film is a film or video inspired by a film, television program, comic book or a similar source, created by fans rather than by the source’s copyright holders or creators.  Fan filmmakers have traditionally been amateurs, but some of the most notable films have actually been produced by professional filmmakers as film school class projects or as demonstration reels.  Fan films vary tremendously in length, from short faux-teaser trailers for non-existent motion pictures to rarer full-length motion pictures.

Distribution

Read more about Film distribution here and Film release here.

Film distribution is the process through which a film is made available for viewing by an audience.  This is normally the task of a professional film distributor, who determines the marketing strategy of the film and the media by which it is to be exhibited or made available for viewing, and who may set the release date and other matters.  The film may be exhibited directly to the public either through a cinema (historically the main way films were distributed) or through television for personal home viewing, including on DVD-Video or Blu-ray Disc, video-on-demand, online downloading, and television programs through broadcast syndication.  Other ways of distributing a film include rental or personal purchase of the film in a variety of media and formats, such as VHS tape or DVD, or Internet downloading or streaming using a computer.

Animation

Read more about Animation here.

Animation is a technique in which each frame of a film is produced individually, whether generated as a computer graphic, by photographing a drawn image, or by repeatedly making small changes to a model unit (see claymation and stop motion) and then photographing the result with a special animation camera.  When the frames are strung together and the resulting film is viewed at a speed of 16 or more frames per second, there is an illusion of continuous movement (due to the phi phenomenon).  Generating such a film is very labour-intensive and tedious, though the development of computer animation has greatly sped up the process.  Because animation is very time-consuming and often very expensive to produce, the majority of animation for television and films comes from professional animation studios.  However, the field of independent animation has existed at least since the 1950’s, with animation being produced by independent studios and sometimes by a single person.  Several independent animation producers have gone on to enter the professional animation industry.
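To give a rough sense of the labour involved in producing each frame individually, the frame counts implied by the speeds mentioned above can be tallied; the one-minute clip length here is an arbitrary assumption for illustration:

```python
# Frames that must be individually drawn or photographed for a hand-made
# animation, at the minimum speed for the illusion of motion (16 frame/s)
# versus the standard sound speed (24 frame/s).
clip_seconds = 60  # a one-minute short (assumption)

for fps in (16, 24):
    frames = clip_seconds * fps
    print(f"{fps} frame/s -> {frames} individual frames")
    # 16 frame/s -> 960 individual frames
    # 24 frame/s -> 1440 individual frames
```

Even a one-minute short demands hundreds of individually produced frames, which is why computer assistance and studio-scale production dominate the field.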

Limited animation is a way of increasing production and decreasing the costs of animation by using shortcuts in the animation process.  This method was pioneered by U.P.A. and popularized by Hanna-Barbera in the United States, and by Osamu Tezuka in Japan, and adapted by other studios as cartoons moved from movie theatres to television.  Although most animation studios are now using digital technologies in their productions, there is a specific style of animation that depends on film.  Camera-less animation, made famous by filmmakers like Norman McLaren, Len Lye, and Stan Brakhage, is painted and drawn directly onto pieces of film, and then run through a projector.


Image © Janke via Wikipedia

Further Information

Blog Posts

Films: Angel Studios.

Films: Sound Of Freedom.

Films: Tim Ballard.

Notes And Links

Article source: Wikipedia; the content is subject to change.

Bence Szemerey on Pexels – The image shown at the top of this page is the copyright of Bence Szemerey.  You can find more great work from the photographer Bence and lots more free stock photos at Pexels.

The Prof. Stampfer’s Stroboscopische Scheibe No. X animation above is the copyright of Simon Ritter von Stampfer and is in the public domain.

The Horse In Motion animation above is the copyright of Eadweard Muybridge and is in the public domain.

The image above of Ottomar Anschütz’s electrotachyscope is of unknown copyright and is in the public domain.

The video above of Roundhay Garden Scene is in the public domain.  You can read more about the film by clicking here.

The video above of Pauvre Pierrot is in the public domain.  You can read more about the film by clicking here.

The video above of Le voyage dans la lune is in the public domain.  You can read more about the film by clicking here.

The video above of The Bond is in the public domain.  You can read more about the film by clicking here.

The image above of Salah Zulfikar is of unknown copyright and is in the public domain.

The image above of The Bolex H16 Reflex camera is the copyright of Wikipedia user Janke and is in the public domain.   

The image above of The Most Productive Cinemas Around The World is of unknown copyright, via Wikipedia.  It comes with a Creative Commons licence (CC BY-SA 3.0).

The image above of The Lumiere Brothers is of unknown copyright and is in the public domain.

The image above of an Animated Horse is the copyright of Wikipedia user Janke and is in the public domain.  It comes with a Creative Commons licence (CC BY-SA 2.5).

Creative Commons – Official website.  They offer better sharing, advancing universal access to knowledge and culture, and fostering creativity, innovation, and collaboration. 

IMDb – Official website.   IMDb is an online database of information related to films, television series, podcasts, home videos, video games, and streaming content online, including cast, production crew and personal biographies, plot summaries, trivia, ratings, and fan and critical reviews.   

IMDb on Facebook.

IMDb on Twitter.

IMDb on YouTube.

Wikipedia – Official website.  Wikipedia is a free online encyclopedia that anyone can edit in good faith. Its purpose is to benefit readers by containing information on all branches of knowledge.  Hosted by the Wikimedia Foundation, it consists of freely editable content, whose articles also have numerous links to guide readers to more information.