
Vintage Radio

Performance improvements for the Hotpoint Bandmaster J35DE console radio

By Maurie Findlay, MIE Aust, VK2PW

If you are one of those who only wish to restore a radio as close as possible to the original, this article is not for you. We can understand those who strive to produce the vintage radio equivalent of the Concours d'Elegance but, as we have pointed out in the past, many valve radios had design faults and unfortunate compromises.

OK, what was wrong with the design of the Hotpoint? At the time it was produced it would have been regarded as a great set.

The most serious fault is the attenuation of the higher audio frequencies due to the tight selectivity of the intermediate frequency stage. Selectivity refers to the "sharpness" of tuning in a radio. This fault was common both to sets manufactured by the big companies and to those built by hobbyists in the 1940s and 1950s.

The usual practice was to have the IF (intermediate frequency) at 455kHz and one IF valve stage. Tuned transformers, each with two circuits, were used, one between the mixer and the IF amplifier and the other between the IF amplifier and the diode detector.

Radios intended for use in country areas sometimes had two IF amplifier stages and three IF transformers – a total of six circuits tuned to 455kHz.
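To get a feel for the numbers, here is a rough sketch (the loaded Q value is an assumption, not a measured figure from the Hotpoint) that estimates the overall -3dB bandwidth of a chain of identical circuits all tuned to 455kHz, using the standard narrowing factor for cascaded single-tuned stages. Real double-tuned IF transformers come out a little wider than this approximation suggests, but the trend is the same: the more tuned circuits, the narrower the bandwidth and the duller the sound.

```python
import math

# Sketch only: -3dB bandwidth of n identical single-tuned circuits in cascade,
# all tuned to the same frequency. Double-tuned IF transformers do somewhat
# better, but the narrowing trend with more circuits is the same.

def cascaded_bandwidth(f0_hz, q, n):
    """Overall -3dB bandwidth of n cascaded identical single-tuned circuits."""
    single = f0_hz / q                      # bandwidth of one tuned circuit
    shrink = math.sqrt(2 ** (1 / n) - 1)    # narrowing factor for n stages
    return single * shrink

F_IF = 455e3   # intermediate frequency
Q = 100        # assumed loaded Q of each tuned circuit (typical, not measured)

for n in (4, 6):
    bw = cascaded_bandwidth(F_IF, Q, n)
    # The audio response extends to roughly half the RF bandwidth, because
    # each audio frequency occupies both sidebands (see below).
    print(f"{n} tuned circuits: RF bandwidth ~{bw/1e3:.1f} kHz, "
          f"audio roll-off near {bw/2e3:.1f} kHz")
```

With four tuned circuits the audio response is already rolling off around 1kHz on these assumptions, and six circuits make it worse still, which is why the country sets sounded so mellow.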

Fig.1: while some vintage radio restorers may regard this as sacrilegious, this diagram shows how the circuit can be modified to improve its performance. Specifically, its audio bandwidth can be widened and the gain increased.

They were great for picking up distant stations but due to the severe attenuation of the high audio frequencies, they always sounded very “mellow”. These days we would simply regard the sound quality as muffled.

In order to appreciate why this happens, we need to look at the nature of the signal transmitted by the radio station.

Say the station is transmitting with a carrier at 1MHz (1000kHz) and it is modulated with a tone of 5kHz. The station is then actually transmitting three separate frequencies: 995kHz, 1000kHz and 1005kHz. If you put in a filter which passes 1000kHz but attenuates the 995kHz and 1005kHz sidebands, the 5kHz tone will be reproduced at a lower level.
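For readers who like to check this numerically, the short sketch below (the modulation index and signal length are chosen purely for illustration) builds an AM signal with a 1MHz carrier and a 5kHz tone and then takes its spectrum; only the carrier and the two sidebands appear.

```python
import numpy as np

# Illustration only: an AM carrier at 1 MHz modulated by a 5 kHz tone
# contains exactly three frequencies - 995 kHz, 1 MHz and 1005 kHz.
fs = 8e6                          # sample rate, well above twice 1.005 MHz
t = np.arange(0, 0.002, 1/fs)     # 2 ms of signal -> 500 Hz FFT resolution
fc, fm, m = 1e6, 5e3, 0.5         # carrier, modulating tone, modulation index

am = (1 + m*np.cos(2*np.pi*fm*t)) * np.cos(2*np.pi*fc*t)

spectrum = np.abs(np.fft.rfft(am)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1/fs)
peaks = freqs[spectrum > 0.01]    # keep only the significant spectral lines
print(peaks)                      # ~[995000, 1000000, 1005000] Hz
```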

Spectrum analysers and other sophisticated test instruments were not generally available in design laboratories in the 1940s and 1950s and many engineers were a bit hazy about the idea of sidebands. In the 1960s, single sideband (SSB) transmission became the standard for high-frequency communication circuits and designers began to realise that you could survive with one sideband only. But that’s another story.

If people wanted a wider audio response in the early years, the solution was to have a TRF (tuned radio frequency) receiver. Many of these were built by hobbyists from designs in popular magazines. They usually had three tuned circuits and two valve amplifying stages, followed by a detector.

The difficulty was that they were only suitable for areas close to strong stations. And if there were other stations close in frequency to the one you wanted, they would often break through. In other words, they had poor selectivity.
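The trade-off can be put in rough figures. The sketch below (the Q value and the 10kHz offset are assumptions chosen for illustration) models three identical single-tuned circuits of modest Q: a 5kHz sideband of the wanted station is barely attenuated, so the audio is wide, but a station only 10kHz away is knocked down by less than 10dB, which is poor selectivity.

```python
import math

# Back-of-envelope sketch: response of a TRF front end made of three
# identical single-tuned circuits. Wide audio, but weak adjacent rejection.

def tuned_circuit_db(f0, q, offset):
    """Attenuation in dB of one parallel tuned circuit at f0 + offset."""
    f = f0 + offset
    x = q * (f/f0 - f0/f)            # normalised detuning
    return -10 * math.log10(1 + x*x)

F0 = 1e6        # wanted station
Q = 50          # assumed loaded Q of each broadcast-band tuned circuit
N = 3           # three tuned circuits, as in the typical hobbyist TRF set

for offset, label in ((5e3, "5 kHz sideband of wanted station"),
                      (10e3, "station 10 kHz away")):
    total = N * tuned_circuit_db(F0, Q, offset)
    print(f"{label}: {total:.1f} dB")
```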


