Saturday, December 18, 2021

About Optical Computers – part 2

Working Principle of Optical Computer

The working principle of an Optical Computer is similar to that of a conventional computer, except that some portions perform their functional operations in the optical domain. Photons are generated by LEDs, lasers and a variety of other devices, and they can be used to encode data just as electrons can.

Design and implementation of optical transistors is currently in progress, with the ultimate aim of building an Optical Computer. Several optical transistor designs are being experimented with. A polarizing screen rotated ninety degrees can effectively block a light beam, and optical transistors can also be made from dielectric materials that have the potential to act as polarizers. Optical logic gates are more challenging, but fundamentally possible: they would involve a control beam and multiple input beams that together produce the correct logical output.
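The crossed-polarizer switch described above can be sketched numerically. The model below assumes ideal polarizers obeying Malus's law (transmitted intensity proportional to the squared cosine of the angle between the beam's polarization and the polarizer axis); the values are illustrative, not measurements of a real device.

```python
import math

def transmitted_intensity(i_in, angle_deg):
    """Malus's law: intensity passed by an ideal polarizer oriented at
    `angle_deg` relative to the light's polarization axis."""
    theta = math.radians(angle_deg)
    return i_in * math.cos(theta) ** 2

# A polarizer aligned with the beam passes it; rotated 90 degrees, it blocks it.
print(transmitted_intensity(1.0, 0))    # 1.0  -> switch "on"
print(transmitted_intensity(1.0, 90))   # ~0.0 -> switch "off"
```

The same on/off behaviour is what an optical transistor needs to provide, only switched by another light beam rather than by mechanically rotating a screen.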


Fig. 6 – (a) Optical Network on Chip (b) Photonic Chip on Circuit

Electrons have one clear advantage: silicon channels and copper wires can be bent around corners, and the electrons will follow. This effect can be emulated in optical chips using plasmonic nanoparticles, which let the light turn corners and continue on its path without major power loss or conversion back to electrons.

Most parts of an optical chip resemble those of any commercially available computer chip, and electrons are still deployed in the parts that transform or process information. The interconnects, however, change drastically. These interconnects shuttle information between different areas of the chip. Instead of shuttling electrons, which slow down as the interconnects heat up, light is shuttled, because light is easily confined and loses less information in transit.

Researchers are hoping that this swift communication might lead to exascale computers, i.e. computers that perform a billion billion (10^18) calculations every second, roughly 1000 times the processing speed of today's fastest systems.

Advantages of Optical Computer

The advantages of Optical Computer are:

·         Optical computers offer several major advantages: high density, small size, low junction heating, high speed, the ability to be dynamically scaled and reconfigured into smaller or larger networks and topologies, massive parallel computing ability and suitability for AI applications.

·         Apart from speed, optical interconnections have several advantages: they are impervious to electromagnetic interference and are not prone to electrical short circuits.

·         They offer low-loss transmission and large bandwidth for parallel communication of several channels.

·         Optical processing of data is inexpensive and much easier than the processing done on electronic components.

·         Since photons carry no charge, they do not readily interact with one another the way electrons do. This adds another advantage: light beams can pass through each other, allowing full-duplex operation.

·         Optical materials have greater accessibility and storage density than magnetic materials.

Disadvantages of Optical Computer

The disadvantages of Optical Computer are:

·         Manufacturing Photonic Crystals is challenging.

·         Computation is complex as it involves interaction of multiple signals.

·         Current optical components are bulky in size.

Future of Optical Computing

We can see interesting developments in lasers and light, which are taking over from electronics in our computers. Optical technology is currently being promoted for use in parallel processing, storage area networks, optical data networks, optical switches, and biometric and holographic storage devices at airports.

Processors now contain light detectors and tiny lasers that transmit data through optical fiber. A few companies are even developing optical processors that use optical switches and laser light to do their calculations. One of the foremost promoters, Intel, is creating an Integrated Silicon Photonics link capable of transmitting 50 gigabits per second of uninterrupted information.

It is speculated that future computers will come without screens, with information presented as a hologram in the air above the keyboard. This kind of technology is being made possible by the collaboration of researchers and industry experts. Also, optical technology's most practical use, the optical networking business, is predicted to grow from 1 billion dollars currently to 3.5 billion.

From <https://electricalfundablog.com/optical-computer/>

 

Optical Computing: Solving Problems at the Speed of Light

According to Moore’s law —actually more like a forecast, formulated in 1965 by Intel co-founder Gordon Moore— the number of transistors in a microprocessor doubles about every two years, boosting the power of the chips without increasing their energy consumption. For half a century, Moore’s prescient vision has presided over the spectacular progress made in the world of computing. However, in 2015 the engineer himself predicted that we were reaching a saturation point with current technology. Today, quantum computing holds out hope for a new technological leap, but there is another option on which many are pinning their hopes: optical computing, which replaces electronics (electrons) with light (photons).

The end of Moore’s law is a natural consequence of physics: to pack more transistors into the same space they have to be shrunk down, which increases their speed while simultaneously reducing their energy consumption. The miniaturisation of silicon transistors has succeeded in breaking the 7-nanometre barrier, which used to be considered the limit, but this reduction cannot continue indefinitely. And although more powerful systems can always be obtained by increasing the number of transistors, in doing so the processing speed will decrease and the heat of the chips will rise.
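Moore's doubling rule is easy to state as a formula. The sketch below projects transistor counts from the Intel 4004 (about 2,300 transistors in 1971, a commonly cited baseline); the base figures are assumptions for illustration, not a claim about any particular chip.

```python
def transistor_count(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under Moore's law, starting from the
    Intel 4004 (roughly 2,300 transistors in 1971)."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Fifty years of doubling every two years is 2**25, about a
# 33-million-fold increase:
print(f"{transistor_count(2021):.3g}")  # ~7.72e+10, i.e. tens of billions
```

That tens-of-billions figure is in line with today's largest processors, which is why the end of the doubling trend matters so much.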

THE HYBRIDIZATION OF ELECTRONICS AND OPTICS

Hence the promise of optical computing: photons move at the speed of light, faster than electrons in a wire. Optical technology is also not a newcomer to our lives: the vast global traffic on the information highways today travels on fibre optic channels, and for years we have used optical readers to burn and read our CDs, DVDs and Blu-Ray discs. However, in the guts of our systems, the photons coming through the fibre optic cable must be converted into electrons in the microchips, and in turn these electrons must be converted to photons in the optical readers, slowing down the process.

An overhead view of a new beamsplitter for silicon photonics chips, one-fiftieth the width of a human hair. Credit: Dan Hixson/University of Utah College of Engineering


Thus, it can be said that our current technology is already a hybridization of electronics and optics. “In the near-term, it is pretty clear that hybrid optical-electronic systems will dominate,” Rajesh Menon, a computer engineer at the University of Utah, tells OpenMind. “For instance, the vast majority of communications data is channelled via photons, while almost all computation and logic is performed by electrons.” And according to Menon, “there are fundamental reasons for this division of labour,” because while less energy is needed to transmit information in the form of photons, the waves associated with the electrons are smaller; that is, the higher speed of photonic devices has as its counterpart a larger size.

This is why some experts see limitations in the penetration of optics in computing. For Caroline Ross, a materials science engineer at the Massachusetts Institute of Technology (MIT), “the most important near-term application [for optics] is communications — managing the flow of optical data from fibres to electronics.” The engineer, whose research produced an optical diode that facilitates this task, tells OpenMind that “the use of light for actual data processing itself is a bit further out.”

THE LASER TRANSISTOR

But although we are still far from the 100% optical microchip —a practical system capable of computing only by using photons— advances are increasing the involvement of photonics in computers. In 2004, University of Illinois researchers Milton Feng and Nick Holonyak Jr. developed the concept of the laser transistor, which replaces one of the two electrical outputs of normal transistors with a light signal in the form of a laser, providing a higher data rate.

For example, today it is not possible to use light for internal communication between different components of a computer, due to the equipment that would be necessary to convert the electrical signal to optical and vice versa; the laser transistor would make this possible. “Similar to transistor integrated circuits, we hope the transistor laser will be [used for] electro-optical integrated circuits for optical computing,” Feng told OpenMind. The co-author of this breakthrough is betting on optical over quantum computing, since it does not require the icy temperatures at which quantum superconductors must operate.

Graduate students Junyi Wu and Curtis Wang and Professor Milton Feng found that light stimulates switching speed in the transistor laser. Credit: L. Brian Stauffer


Proof of the interest in this type of system is the intense research in this field, which includes new materials capable of supporting photon-based computing. Among the challenges still to be met in order to obtain optical chips, Menon highlights the integration density of the components in order to reduce the size, an area in which his laboratory is a pioneer, as well as a “better understanding of light-matter interactions at the nanoscale.”

Despite all this, we shouldn’t be overly confident that a photonic laptop will one day reach the hands of consumers. “We don’t expect optical computing to supplant electronic general-purpose computing in the near term,” Mo Steinman, vice president of engineering at Lightelligence, a startup from the photonics lab run by Marin Soljačić at MIT, told OpenMind.

Present and future of photonics

However, the truth is that nowadays this type of computing already has its own niches. “Application-specific photonics is already here, particularly in data centres and more recently in machine learning,” says Menon. In fact, Artificial Intelligence (AI) neural networks are being touted as one of its great applications, with the potential to achieve 10 million times greater efficiency than electronic systems. “Statistical workloads such as those employed in AI algorithms are perfectly suited for optical computing,” says Steinman.

Thus, optical computing can solve very complex network optimization problems that would take centuries for classical computers. In Japan, the NTT company is building a huge optical computer that encloses five kilometres of fibre in a box the size of a room, and will be applied to complicated power or communications networks enhancement tasks.

A photonic integrated circuit. Credit: JonathanMarks


“Looking ahead, we believe we can leverage the ecosystem created by optical telecommunications in the areas of integrated circuit design, fabrication, and packaging, and optimize for the specific operating points required by optical computing,” Steinman predicts. However, he admits that moving from a prototype to full-scale manufacturing will be a difficult challenge.

In short, there are reasons for optimism about the development of optical computing, but without overestimating its possibilities: when computer scientist Dror Feitelson published his book Optical Computing (MIT Press) in 1988, there was talk of a new field that was already beginning to reach maturity. More than 30 years later, “optical computing is still more of a promise than a mainstream technology,” the author tells OpenMind. And the challenges still to be overcome are compounded by another stumbling block: technological inertia. Feitelson recalls the warning issued in those days by IBM researcher Robert Keyes: with the enormous experience and accumulated investment in electronics that we already know, “practically any other technology would be unable to catch up.”

From <https://www.bbvaopenmind.com/en/technology/future/optical-computing-solving-problems-at-the-speed-of-light/>

 Optical computers light up the horizon

 

Optical chips will power our future datacenters and supercomputers. Electronic chips can now have a layer of optical components, such as lasers and switches, added to them to increase their computing power. Credit: Martijn Heck, Aarhus University

Since their invention, computers have become faster and faster, as a result of our ability to increase the number of transistors on a processor chip.

Today, your smartphone is millions of times faster than the computers NASA used to put the first man on the moon in 1969. It even outperforms the most famous supercomputers from the 1990s. However, we are approaching the limits of this electronic technology, and now we see an interesting development: light and lasers are taking over electronics in computers.

Processors can now contain tiny lasers and light detectors, so they can send and receive data through small optical fibres, at speeds far exceeding the copper lines we use now. A few companies are even developing optical processors: chips that use laser light and optical switches, instead of currents and electronic transistors, to do calculations.

So, let us first take a closer look at why our current technology is running out of steam. And then, of course, answer the main question: when can you buy that optical computer?

Moore's Law is dying

Computers work with ones and zeros for all their calculations and transistors are the little switches that make that happen. Current processor chips, or integrated circuits, consist of billions of transistors. In 1965, Gordon Moore, founder of Intel, predicted that the number of transistors per chip would double every two years. This became known as Moore's Law, and after more than half a century, it is still alive. Well, it appears to be alive...

In fact, we are fast reaching the end of this scaling. Transistors are now approaching the size of an atom, which means that quantum mechanical effects are becoming a bottleneck. The electrons, which make up the current, can randomly disappear from such tiny electrical components, messing up the calculations.

Moreover, the newest technology, where transistors have a size of only five nanometers, is now so complex that it might become too expensive to improve. A semiconductor fabrication plant for this five-nanometer chip technology, to be operational in 2020, has already cost a steep 17 billion US dollars to build.


Computer processor chips have plateaued

Looking more closely, however, the performance growth in transistors has been declining. Remember the past, when every few years faster computers hit the market? From 10 MHz clock speed in the 80s, to 100 MHz in the 90s and 1 GHz in 2000? That has stopped, and computers have been stuck at about 4 GHz for over 10 years.

Of course with smart chip design, for example using parallel processing in multi-core processors, we can still increase the performance, so your computer still works faster, but this increased speed is not due to the transistors themselves.

And these gains come at a cost. All those cores on the processor need to communicate with each other, to share tasks, which consumes a lot of energy. So much so that the communication on and between chips is now responsible for more than half of the total power consumption of the computer.

Since computers are everywhere, in our smartphone and laptop, but also in datacenters and the internet, this energy consumption is actually a substantial amount of our carbon footprint.

For example, there are bold estimations that intense use of a smartphone connected to the Internet consumes the same amount of energy as a fridge. Surprising, right? Do not worry about your personal electricity bill, though, as this is the energy consumed by the datacenters and networks. And the number and use of smartphones and other wearable tech keeps growing.

Fear not: lasers to the rescue

So, how can we reduce the energy consumption of our computers and make them more sustainable? The answer becomes clear when we look at the Internet.

In the past, we used electrical signals, going through copper wires, to communicate. The optical fibre, guiding laser light, has revolutionised communications, and has made the Internet what it is today: Fast and extending across the entire world. You might even have fibre all the way to your home.

We are using the same idea for the next generation computers and servers. No longer will the chips be plugged in on motherboards with copper lines, but instead we will use optical waveguides. These can guide light, just like optical fibres, and are embedded into the motherboard. Small lasers and photodiodes are then used to generate and receive the data signal. In fact, companies like Microsoft are already considering this approach for their cloud servers.

Optical chips are already a reality

Now I know what you're thinking right about now:

"But wait a second, how will these chips communicate with each other using light? Aren't they built to generate an electrical current?"

Yes, they are. Or, at least, they were. But interestingly, silicon chips can be adapted to include transmitters and receivers for light, alongside the transistors.

Researchers from the Massachusetts Institute of Technology in the US have already achieved this, and have now started a company (Ayar Labs) to commercialise the technology.

Here at Aarhus University in Denmark we are thinking even further ahead: If chips can communicate with each other optically, using laser light, would it not also make sense that the communication on a chip—between cores and transistors—would benefit from optics?

We are doing exactly that. In collaboration with partners across Europe, we are figuring out whether we can make more energy-efficient memory by writing the bits and bytes using laser light, integrated on a chip. This is very exploratory research, but if we succeed, it could change future chip technology as early as 2030.

The future: optical computers on sale in five years?

So far so good, but there is a caveat: Even though optics are superior to electronics for communication, they are not very suitable for actually carrying out calculations. At least, when we think binary—in ones and zeros.

Here the human brain may hold a solution. We do not think in a binary way. Our brain is not digital, but analogue, and it makes calculations all the time.

Computer engineers are now realising the potential of such analogues, or brain-like, computing, and have created a new field of neuromorphic computing, where they try to mimic how the human brain works using electronic chips.

And it turns out that optics are an excellent choice for this new brain-like way of computing.

The same kind of technology used by MIT and our team, at Aarhus University, to create optical communications between and on silicon chips, can also be used to make such neuromorphic optical chips.
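A minimal sketch of what such a neuromorphic optical chip computes: its core operation is a matrix-vector multiplication, where input light intensities are weighted by a programmable mesh of optical elements and summed on detectors. The weights and inputs below are arbitrary illustrative numbers, and ordinary NumPy stands in for the optics.

```python
import numpy as np

# Hypothetical on-chip weights: what the mesh of interferometers and
# attenuators would be programmed to implement.
rng = np.random.default_rng(0)
weights = rng.uniform(0, 1, size=(3, 4))

# Input light intensities on four waveguides.
inputs = np.array([0.2, 0.9, 0.5, 0.1])

# The optics perform this whole multiply-and-accumulate in a single pass
# of light through the chip.
outputs = weights @ inputs
print(outputs)
```

Because the multiplication happens as light propagates, rather than as a long sequence of clocked transistor operations, this is exactly the workload where photonics can beat electronics on energy per operation.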

In fact, it has already been shown that such chips can do some basic speech recognition. And two start-ups in the US, Lightelligence and Lightmatter, have now taken up the challenge to realise such optical chips for artificial intelligence.

Optical chips are still some way behind electronic chips, but we're already seeing the results and this research could lead to a complete revolution in computer power. Maybe in five years from now we will see the first optical co-processors in supercomputers. These will be used for very specific tasks, such as the discovery of new pharmaceutical drugs.

But who knows what will follow after that? In ten years these chips might be used to detect and recognise objects in self-driving cars and autonomous drones. And when you are talking to Apple's Siri or Amazon's Echo, by then you might actually be speaking to an optical computer.

While the 20th century was the age of the electron, the 21st century is the age of the photon – of light. And the future shines bright.

 

From <https://phys.org/news/2018-03-optical-horizon.html>

 

 For all discussed seminar topics list click here Index.

…till next post, bye-bye and take care.

 


Friday, December 17, 2021

About Optical Computers

Optical computing: As per Wikipedia



Optical computing or photonic computing uses photons produced by lasers or diodes for computation. For decades, photons have shown promise to enable a higher bandwidth than the electrons used in conventional computers (see optical fibers).

Most research projects focus on replacing current computer components with optical equivalents, resulting in an optical digital computer system processing binary data. This approach appears to offer the best short-term prospects for commercial optical computing, since optical components could be integrated into traditional computers to produce an optical-electronic hybrid. However, optoelectronic devices consume 30% of their energy converting electronic energy into photons and back; this conversion also slows the transmission of messages. All-optical computers eliminate the need for optical-electrical-optical (OEO) conversions, thus reducing electrical power consumption.

Application-specific devices, such as synthetic aperture radar (SAR) and optical correlators, have been designed to use the principles of optical computing. Correlators can be used, for example, to detect and track objects, and to classify serial time-domain optical data.

Optical components for binary digital computer

The fundamental building block of modern electronic computers is the transistor. To replace electronic components with optical ones, an equivalent optical transistor is required. This is achieved using materials with a non-linear refractive index. In particular, materials exist where the intensity of incoming light affects the intensity of the light transmitted through the material in a similar manner to the current response of a bipolar transistor. Such an optical transistor[5][6] can be used to create optical logic gates,[6] which in turn are assembled into the higher level components of the computer's central processing unit (CPU). These will be nonlinear optical crystals used to manipulate light beams into controlling other light beams.
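The idea of an intensity-dependent optical transistor can be sketched as a simple threshold model: below a certain incident intensity the material transmits essentially nothing, above it the light passes. The threshold and beam intensities below are illustrative assumptions, not properties of any real nonlinear crystal.

```python
def optical_transistor(i_in, threshold=1.5):
    """Toy model of a nonlinear optical material: it transmits light only
    when the incident intensity exceeds a threshold (values are
    illustrative, not taken from a real material)."""
    return i_in if i_in >= threshold else 0.0

def optical_and(beam_a, beam_b):
    """Two combined beams of intensity 1.0 each only clear the 1.5
    threshold when both are present -- an AND gate."""
    return optical_transistor(beam_a + beam_b) > 0

print(optical_and(1.0, 1.0))  # True
print(optical_and(1.0, 0.0))  # False
```

Gates built this way could then be cascaded into the higher-level components of a CPU, exactly as electronic transistors are today.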

Like any computing system, an optical computing system needs three things to function well:

·         optical processor

·         optical data transfer, e.g. fiber optic cable

·         optical storage

Substituting electrical components will need data format conversion from photons to electrons, which will make the system slower.

Controversy

There are some disagreements between researchers about the future capabilities of optical computers; whether or not they may be able to compete with semiconductor-based electronic computers in terms of speed, power consumption, cost, and size is an open question. Critics note that real-world logic systems require "logic-level restoration, cascadability, fan-out and input–output isolation", all of which are currently provided by electronic transistors at low cost, low power, and high speed. For optical logic to be competitive beyond a few niche applications, major breakthroughs in non-linear optical device technology would be required, or perhaps a change in the nature of computing itself.

Misconceptions, challenges, and prospects

A significant challenge to optical computing is that computation is a nonlinear process in which multiple signals must interact. Light, which is an electromagnetic wave, can only interact with another electromagnetic wave in the presence of electrons in a material,[10] and the strength of this interaction is much weaker for electromagnetic waves, such as light, than for the electronic signals in a conventional computer. This may result in the processing elements for an optical computer requiring more power and larger dimensions than those for a conventional electronic computer using transistors.

A further misconception is that since light can travel much faster than the drift velocity of electrons, and at frequencies measured in THz, optical transistors should be capable of extremely high frequencies. However, any electromagnetic wave must obey the transform limit, and therefore the rate at which an optical transistor can respond to a signal is still limited by its spectral bandwidth. However, in fiber optic communications, practical limits such as dispersion often constrain channels to bandwidths of 10s of GHz, only slightly better than many silicon transistors. Obtaining dramatically faster operation than electronic transistors would therefore require practical methods of transmitting ultrashort pulses down highly dispersive waveguides.
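The transform limit above can be illustrated with a rule-of-thumb calculation: a device's minimum response time is roughly the inverse of its usable spectral bandwidth (ignoring the pulse-shape-dependent time-bandwidth constant). The bandwidth figures below are the rough orders of magnitude mentioned in the text.

```python
def min_switch_time(bandwidth_hz):
    """Rough lower bound on response time for a signal confined to the
    given spectral bandwidth (time-bandwidth constant omitted)."""
    return 1.0 / bandwidth_hz

# A 10 GHz channel, the dispersion-limited figure cited above:
print(min_switch_time(10e9))    # 1e-10 s, i.e. ~100 picoseconds

# The ~hundreds-of-THz optical carrier itself would allow femtosecond
# switching, but dispersion keeps the usable bandwidth far lower:
print(min_switch_time(400e12))  # 2.5e-15 s
```

This is why a THz carrier frequency does not automatically translate into THz logic: the usable modulation bandwidth, not the carrier, sets the switching rate.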

Photonic logic

Realization of a photonic controlled-NOT gate for use in quantum computing

Photonic logic is the use of photons (light) in logic gates (NOT, AND, OR, NAND, NOR, XOR, XNOR). Switching is obtained using nonlinear optical effects when two or more signals are combined.

Resonators are especially useful in photonic logic, since they allow a build-up of energy from constructive interference, thus enhancing optical nonlinear effects.
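Switching by combining signals can be sketched with the interference of two coherent fields: in phase they reinforce, and 180 degrees out of phase they cancel. This is an idealized two-beam model, not a description of any particular device.

```python
import cmath

def combined_intensity(amp_a, amp_b, phase_deg):
    """Intensity of two coherent beams combined with a relative phase
    (intensity is the squared magnitude of the summed fields)."""
    field = amp_a + amp_b * cmath.exp(1j * cmath.pi * phase_deg / 180)
    return abs(field) ** 2

print(combined_intensity(1, 1, 0))    # 4.0 -- constructive: full "on"
print(combined_intensity(1, 1, 180))  # ~0  -- destructive: "off"
```

A resonator boosts exactly this effect: by recirculating the constructively interfering light, it builds up the field strength needed to trigger the material's weak optical nonlinearity.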

Other approaches that have been investigated include photonic logic at a molecular level, using photoluminescent chemicals. In a demonstration, Witlicki et al. performed logical operations using molecules and SERS.

From <https://en.wikipedia.org/wiki/Optical_computing>

What is Optical Computer?

An optical computer (also called a photonic computer) is a device that uses the photons in visible light or infrared (IR) beams, rather than electric current, to perform digital computations. An electric current flows at only about 10 percent of the speed of light. This limits the rate at which data can be exchanged over long distances, and is one of the factors that led to the evolution of optical fiber. By applying some of the advantages of visible and/or IR networks at the device and component scale, a computer might someday be developed that can perform operations 10 or more times faster than a conventional electronic computer.

Visible-light and IR beams, unlike electric currents, pass through each other without interacting. Several (or many) laser beams can be shone so their paths intersect, but there is no interference among the beams, even when they are confined essentially to two dimensions. Electric currents must be guided around each other, and this makes three-dimensional wiring necessary. Thus, an optical computer, besides being much faster than an electronic one, might also be smaller.
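The speed difference can be put into numbers. The sketch below uses the article's roughly 10%-of-light-speed figure for electrical signalling, and roughly two-thirds of light speed for light in glass fiber (refractive index about 1.5); both are rough illustrative assumptions.

```python
C = 3.0e8  # speed of light in vacuum, m/s

def delay_ns(distance_m, speed_fraction):
    """One-way signal delay in nanoseconds, travelling at the given
    fraction of the vacuum speed of light."""
    return distance_m / (speed_fraction * C) * 1e9

# Across a 100 m datacenter run:
print(delay_ns(100, 0.10))  # electrical (~0.1c):  ~3333 ns
print(delay_ns(100, 0.67))  # optical in fiber:    ~498 ns
```

The gap only widens with distance, which is why optics took over long-haul communication first and is now working its way down toward the chip.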

Some engineers think optical computing will someday be common, but most agree that transitions will occur in specialized areas one at a time. Some optical integrated circuits have been designed and manufactured. (At least one complete, although rather large, computer has been built using optical circuits.) Three-dimensional, full-motion video can be transmitted along a bundle of fibers by breaking the image into voxels. Some optical devices can be controlled by electronic currents, even though the impulses carrying the data are visible light or IR. Optical technology has made its most significant inroads in digital communications, where fiber optic data transmission has become commonplace. The ultimate goal is the so-called photonic network, which uses visible and IR energy exclusively between each source and destination. Optical technology is employed in CD-ROM drives and their relatives, laser printers, and most photocopiers and scanners. However, none of these devices are fully optical; all rely to some extent on conventional electronic circuits and components.

 From <https://whatis.techtarget.com/definition/optical-computer-photonic-computer>


Optical Computer – Components, Working Principle and Why We Need It

Optical Computer is indeed the computer technology of future which uses light particles called Photons. This post will discuss Optical Computer, Optical Components required for computation, why we need it, its working principle, advantages and disadvantages.

What is an Optical Computer?

A device that uses photons or infrared beams, instead of electric current, for its digital computations is termed a Photonic or Optical Computer.

The flow of electric current is only about 10 percent of the speed of light. This poses severe restrictions on long-distance data transmission, restrictions which led to the evolution of optical fiber. By applying the advantages of IR networks and/or visible light at the component and device scale, a computer (the Optical Computer) can be developed with 10 or more times the processing power of conventional systems.



Fig. 3 – Prototype of Optical Computer

Unlike electric current, IR beams and visible light can pass through each other without interacting. Several laser beams can be projected so that their paths intersect, yet the beams show no interference, even when they are confined to two dimensions.

From <https://electricalfundablog.com/optical-computer/>

With electric currents, three-dimensional wiring becomes necessary since they have to be guided around each other. Thus an Optical Computer, apart from being faster, can also be smaller. Figure 4 below shows an 8-bit, Bit-Serial Optical Computer.


Fig. 4 – Bit-Serial Optical Computer

Main Optical Components in Optical Computer

The main Optical components required for computing in an Optical Computer are:

·         VCSEL (Vertical Cavity Surface Emitting Micro Laser)

·         Spatial Light Modulators

·         Optical Logical Gates

·         Smart Pixels

VCSEL (Vertical Cavity Surface Emitting Micro Laser)

VCSEL is a semiconductor micro laser diode that emits light vertically from its surface, converting electrical signals to optical signals. It is a good example of a one-dimensional Photonic Crystal.

Spatial Light Modulators

Spatial Light Modulators are responsible for modulating the intensity and the phase of the Optical beam. They are used in Holographic Data Storage systems as they encode the information into a laser beam.
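Conceptually, an amplitude-mode SLM acts like a programmable mask multiplied element-wise into a uniform wavefront. The binary data page below is an illustrative example of how bits could be imprinted on a beam for holographic storage; real SLMs can also modulate phase and use far larger pixel arrays.

```python
import numpy as np

# Uniform incoming laser wavefront (unit intensity at every pixel).
beam = np.ones((4, 4))

# A "data page" of bits to be written into the hologram.
data_page = np.array([[1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [1, 1, 0, 0],
                      [0, 0, 1, 1]])

# The SLM attenuates each pixel independently, so the transmitted light
# now carries the bits.
modulated = beam * data_page
print(modulated)
```

The whole page is written in parallel with one exposure, which is where holographic storage gets its throughput advantage over bit-at-a-time media.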

Optical Logic Gates

An Optical Logic Gate is essentially an Optical Switch that controls light beams. It is said to be “ON” when the device transmits light and “OFF” when it blocks the light.

Smart Pixels

Smart Pixels provide Optical Systems with high levels of electronic signal processing.

Why do we need Optical Computer?

The need for Optical Computers emerged from the fact that conventional computers are limited by the time response of electronic circuits, and that the build-up of heat damages electronic components. For example, microprocessors contain billions of transistors and sometimes operate at clock speeds in excess of 3 billion cycles per second, which exposes the transistors to a lot of heat and accelerates their chances of damage.

The other factors which add to the need for a better alternative are:

·         The End of Electron Based Computing as Moore’s Law is Failing

·         A plateau in Computer Processing Chips

The End of Electron Based Computing as Moore’s Law is Failing

Computers work with zeros and ones. Little switches called transistors make this possible, and billions of them are found on current Integrated Circuits and Processor Chips. In 1965, the founder of Intel, Gordon Moore, predicted that the number of transistors on every chip would double every two years. This came to be popularly known as Moore’s law.

This prediction held up into the beginning of the 21st century. While the predicted exponential growth has not completely stopped, it has certainly slowed down. Transistors are now being manufactured at near-atomic sizes, which means that quantum mechanical effects will soon become a bottleneck.

At these scales, electrons can randomly leak out of the minute electrical components, resulting in incorrect calculations. Also, the latest technology, in which transistors measure only five nanometers, has become very complex and too expensive to advance.

A Plateau in Computer Processing Chips

A closer inspection reveals that gains in transistor performance have slowed. Looking back, faster computers hit the market every few years; today, however, clock speeds have plateaued at around 4 GHz. It is still possible to improve performance with smarter chips and parallel processing, but this increase in speed comes not from the transistors alone but from various other circuitry.


Fig. 5 – Optical Transistors in Optical Computer

All these benefits incur a cost. Processor cores must constantly communicate with one another, and this communication consumes energy; it is so high that communication between chips is known to consume more than half of the total computing power. Since computers power our smartphones, laptops, the internet, and data centers, this energy consumption leaves behind a substantial carbon footprint.

For all discussed seminar topics list click here Index.

…till next post, bye-bye and take care.

Thursday, December 16, 2021

About Bluetooth Technology - part 2

Bluetooth® Wireless Technology

One key reason for the incredible success of Bluetooth® technology is the tremendous flexibility it provides developers. Offering two radio options, Bluetooth technology provides developers with a versatile set of full-stack, fit-for-purpose solutions to meet the ever-expanding needs for wireless connectivity.

Whether a product streams high-quality audio between a smartphone and speaker, transfers data between a tablet and medical device, or sends messages between thousands of nodes in a building automation solution, the Bluetooth Low Energy (LE) and Bluetooth Classic radios are designed to meet the unique needs of developers worldwide.

From <https://www.bluetooth.com/learn-about-bluetooth/tech-overview/>



Bluetooth® Classic

The Bluetooth Classic radio, also referred to as Bluetooth Basic Rate/Enhanced Data Rate (BR/EDR), is a low power radio that streams data over 79 channels in the 2.4GHz unlicensed industrial, scientific, and medical (ISM) frequency band. Supporting point-to-point device communication, Bluetooth Classic is mainly used to enable wireless audio streaming and has become the standard radio protocol behind wireless speakers, headphones, and in-car entertainment systems. The Bluetooth Classic radio also enables data transfer applications, including mobile printing.

Bluetooth® Low Energy (LE)

The Bluetooth Low Energy (LE) radio is designed for very low power operation. Transmitting data over 40 channels in the 2.4GHz unlicensed ISM frequency band, the Bluetooth LE radio provides developers a tremendous amount of flexibility to build products that meet the unique connectivity requirements of their market. Bluetooth LE supports multiple communication topologies, expanding from point-to-point to broadcast and, most recently, mesh, enabling Bluetooth technology to support the creation of reliable, large-scale device networks. While initially known for its device communications capabilities, Bluetooth LE is now also widely used as a device positioning technology to address the increasing demand for high accuracy indoor location services. Initially supporting simple presence and proximity capabilities, Bluetooth LE now supports Bluetooth® Direction Finding and soon, high-accuracy distance measurement.

 

Bluetooth Low Energy (LE) vs. Bluetooth Classic

Frequency Band
LE: 2.4GHz ISM Band (2.402 – 2.480 GHz Utilized)
Classic: 2.4GHz ISM Band (2.402 – 2.480 GHz Utilized)

Channels
LE: 40 channels with 2 MHz spacing (3 advertising channels / 37 data channels)
Classic: 79 channels with 1 MHz spacing

Channel Usage
LE: Frequency-Hopping Spread Spectrum (FHSS)
Classic: Frequency-Hopping Spread Spectrum (FHSS)

Modulation
LE: GFSK
Classic: GFSK, π/4 DQPSK, 8DPSK

Data Rate
LE: LE 2M PHY: 2 Mb/s; LE 1M PHY: 1 Mb/s; LE Coded PHY (S=2): 500 Kb/s; LE Coded PHY (S=8): 125 Kb/s
Classic: EDR PHY (8DPSK): 3 Mb/s; EDR PHY (π/4 DQPSK): 2 Mb/s; BR PHY (GFSK): 1 Mb/s

Tx Power*
LE: 100 mW (+20 dBm)
Classic: 100 mW (+20 dBm)

Rx Sensitivity
LE: LE 2M PHY: -70 dBm; LE 1M PHY: -70 dBm; LE Coded PHY (S=2): -75 dBm; LE Coded PHY (S=8): -82 dBm
Classic: -70 dBm

Data Transports
LE: Asynchronous Connection-oriented; Isochronous Connection-oriented; Asynchronous Connectionless; Synchronous Connectionless; Isochronous Connectionless
Classic: Asynchronous Connection-oriented; Synchronous Connection-oriented

Communication Topologies
LE: Point-to-Point (including piconet); Broadcast; Mesh
Classic: Point-to-Point (including piconet)

Positioning Features
LE: Presence (Advertising); Proximity (RSSI); Direction (AoA/AoD); Distance (Coming)
Classic: None

* Devices shall not exceed the maximum allowed transmit power levels set by the regulatory bodies that have jurisdiction over the locales in which the device is to be sold or intended to operate. Implementers should be aware that the maximum transmit power level permitted under a given set of regulations might not be the same for all modulation modes.
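Two sets of figures in the comparison above follow from simple arithmetic, sketched below: the LE Coded PHY rates are the LE radio's 1 Msym/s symbol rate divided by the coding factor S, and the Tx Power entry is the usual milliwatt-to-dBm conversion.

```python
import math

def coded_phy_rate_kbps(s: int, symbol_rate_kbps: int = 1000) -> float:
    """LE Coded PHY data rate: the 1 Msym/s symbol rate divided by
    the coding factor S (each bit is spread over S symbols)."""
    return symbol_rate_kbps / s

def mw_to_dbm(p_mw: float) -> float:
    """Convert power in milliwatts to dBm: 10 * log10(P / 1 mW)."""
    return 10 * math.log10(p_mw)

print(coded_phy_rate_kbps(s=2))  # 500.0 Kb/s, the LE Coded PHY (S=2) entry
print(coded_phy_rate_kbps(s=8))  # 125.0 Kb/s, the LE Coded PHY (S=8) entry
print(mw_to_dbm(100))            # 20.0 dBm, the 100 mW Tx Power entry
```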

 

From <https://www.bluetooth.com/learn-about-bluetooth/tech-overview/>

 

How does Bluetooth work?

November 5, 2007

We go straight to the source and get Bluetooth executive director Michael Foley to wirelessly transmit an answer to this query.

Bluetooth technology is a short-range wireless communications technology to replace the cables connecting electronic devices, allowing a person to have a phone conversation via a headset, use a wireless mouse and synchronize information from a mobile phone to a PC, all using the same core system.

The Bluetooth RF transceiver (or physical layer) operates in the unlicensed ISM band centered at 2.4 gigahertz (the same range of frequencies used by microwaves and Wi-Fi). The core system employs a frequency-hopping transceiver to combat interference and fading.

Bluetooth devices are managed using an RF topology known as a "star topology." A group of devices synchronized in this fashion forms a piconet, which may contain one master and up to seven active slaves, with additional slaves that are not actively participating in the network. (A given device may also be part of one or more piconets, either as a master or as a slave.) In a piconet, the physical radio channel is shared by a group of devices that are synchronized to a common clock and frequency-hopping pattern, with the master device providing the synchronization references.

Let's say the master device is your mobile phone. All of the other devices in your piconet are known as slaves. This could include your headset, GPS receiver, MP3 player, car stereo, and so on.
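The star topology just described can be sketched as a small data structure: one master, at most seven active slaves, and any further devices held synchronized but inactive (parked). The class and device names below are illustrative, not part of the Bluetooth specification.

```python
class Piconet:
    """Sketch of the piconet described above: one master plus up to
    seven active slaves; additional devices stay synchronized but parked."""
    MAX_ACTIVE_SLAVES = 7

    def __init__(self, master: str):
        self.master = master
        self.active_slaves: list[str] = []
        self.parked_slaves: list[str] = []

    def add_slave(self, name: str) -> None:
        # The eighth and later slaves are parked rather than active.
        if len(self.active_slaves) < self.MAX_ACTIVE_SLAVES:
            self.active_slaves.append(name)
        else:
            self.parked_slaves.append(name)

net = Piconet(master="mobile phone")
for device in ["headset", "GPS receiver", "MP3 player", "car stereo"]:
    net.add_slave(device)
print(net.active_slaves)
```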



Devices in a piconet use a specific frequency-hopping pattern, which is algorithmically determined by the master device. The basic hopping pattern is a pseudorandom ordering of the 79 frequencies in the ISM band. The hopping pattern may be adapted to exclude a portion of the frequencies that are used by interfering devices. The adaptive hopping technique improves Bluetooth technology's coexistence with static (nonhopping) ISM systems, such as Wi-Fi networks, when these are located in the vicinity of a piconet.
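The adaptive hopping idea can be sketched as follows. Note the hedge: real Bluetooth derives the hop sequence from the master's clock and device address as defined in the Core Specification; the seeded pseudorandom shuffle here is only a stand-in that captures the two essentials, a pseudorandom ordering of the 79 channels and the exclusion of channels used by interferers.

```python
import random

def adaptive_hop_pattern(seed: int, excluded=frozenset()) -> list[int]:
    """Illustrative sketch of adaptive frequency hopping: a pseudorandom
    ordering of the 79 ISM-band channels, minus any channels occupied by
    interfering devices such as a nearby Wi-Fi network."""
    channels = [ch for ch in range(79) if ch not in excluded]
    rng = random.Random(seed)  # stand-in for the master clock/address
    rng.shuffle(channels)
    return channels

# Skip 22 channels overlapping a hypothetical nearby Wi-Fi network:
pattern = adaptive_hop_pattern(seed=2021, excluded=set(range(22)))
print(len(pattern))  # 57 usable channels remain in the hop sequence
```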

The physical channel (or the wireless link) is subdivided into time units known as slots. Data is transmitted between Bluetooth-enabled devices in packets that are positioned in these slots. Frequency hopping takes place between the transmission or reception of packets, so the packets that make up one transmission may be sent over different frequencies within the ISM band.

The physical channel is also used as a transport for one or more logical links that support synchronous and asynchronous traffic as well as broadcast traffic. Each type of link has a specific use. For instance, synchronous traffic is used to carry hands-free audio data, while asynchronous traffic may carry other forms of data that can withstand more variability in the timing for delivery, such as printing a file or synchronizing your calendar between your phone and computer.

One of the complexities often associated with wireless technology is the process of connecting wireless devices. Users have become accustomed to the process of connecting wired devices by plugging one end of a cable into one device and the other end into the complementary device.

Bluetooth technology uses the principles of device "inquiry" and "inquiry scan." Scanning devices listen in on known frequencies for devices that are actively inquiring. When an inquiry is received, the scanning device sends a response with the information needed for the inquiring device to determine and display the nature of the device that has recognized its signal.

Let's say you want to wirelessly print a picture from your mobile phone to a nearby printer. In this case, you go to the picture on your phone and select print as an option for sending that picture. The phone would begin searching for devices in the area. The printer (the scanning device) would respond to the inquiry and, as a result, would appear on the phone as an available printing device. By responding, the printer is ready to accept the connection. When you select the Bluetooth wireless printer, the printing process kicks off by establishing connections at successively higher layers of the Bluetooth protocol stack that, in this case, control the printing function.
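The inquiry / inquiry-scan exchange in the printing example can be sketched as below. The class, field, and device names are illustrative only; real Bluetooth uses defined inquiry packets, not these dictionaries.

```python
from dataclasses import dataclass

@dataclass
class ScanningDevice:
    """A device performing inquiry scan: it listens for inquiries and,
    when discoverable, answers with the information the inquiring
    device needs to list it."""
    name: str
    device_class: str
    discoverable: bool = True

    def on_inquiry(self):
        if self.discoverable:
            return {"name": self.name, "class": self.device_class}
        return None  # hidden devices stay silent

def inquire(nearby_devices):
    """The inquiring device (e.g. the phone) collects every response."""
    return [r for d in nearby_devices if (r := d.on_inquiry())]

printer = ScanningDevice("OfficeJet", "printer")
hidden = ScanningDevice("Laptop", "computer", discoverable=False)
print(inquire([printer, hidden]))  # only the printer appears on the phone
```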

Like any successful technology, all of this complexity goes on without the user being aware of anything more than the task he or she is trying to complete, like connecting devices and talking hands-free or listening to high-quality stereo music on wireless headphones.

 

From <https://www.scientificamerican.com/article/experts-how-does-bluetooth-work/>

 

What Bluetooth is, what it does, and how it works

Bluetooth is a short-range wireless communication technology that allows devices such as mobile phones, computers, and peripherals to transmit data or voice wirelessly over a short distance. The purpose of Bluetooth is to replace the cables that normally connect devices, while still keeping the communications between them secure.

The "Bluetooth" name is taken from a 10th-century Danish king named Harald Bluetooth, who was said to unite disparate, warring regional factions. Like its namesake, Bluetooth technology brings together a broad range of devices across many different industries through a unifying communication standard. 

Bluetooth Technology

Developed in 1994, Bluetooth was intended as a wireless replacement for cables. It uses the same 2.4GHz frequency as some other wireless technologies in the home or office, such as cordless phones and WiFi routers. It creates a 10-meter (33-foot) radius wireless network, called a personal area network (PAN) or piconet, which can network between two and eight devices. This short-range network allows you to send a page to your printer in another room, for example, without having to run an unsightly cable.

Bluetooth uses less power and costs less to implement than Wi-Fi. Its lower power also makes it far less prone to suffering from or causing interference with other wireless devices in the same 2.4GHz radio band. 

Bluetooth range and transmission speeds are typically lower than Wi-Fi (the wireless local area network that you may have in your home). Bluetooth v3.0 + HS — Bluetooth high-speed technology — devices can deliver up to 24 Mbps of data, which is faster than the 802.11b WiFi standard, but slower than wireless-a or wireless-g standards. As technology has evolved, however, Bluetooth speeds have increased.

The Bluetooth 4.0 specification was officially adopted on July 6, 2010. Bluetooth version 4.0 features include low energy consumption, low cost, multivendor interoperability, and enhanced range. 

The hallmark feature enhancement to the Bluetooth 4.0 spec is its lower power requirements; devices using Bluetooth v4.0 are optimized for low battery operation and can run off of small coin-cell batteries, opening up new opportunities for wireless technology. Instead of fearing that leaving Bluetooth on will drain your cell phone's battery, for example, you can leave a Bluetooth v4.0 mobile phone connected all the time to your other Bluetooth accessories. 

Connecting With Bluetooth

Many mobile devices have Bluetooth radios embedded in them. PCs and some other devices that do not have built-in radios can be Bluetooth-enabled by adding a Bluetooth dongle, for example.

The process of connecting two Bluetooth devices is called "pairing." Generally, devices broadcast their presence to one another, and the user selects the Bluetooth device they want to connect to when its name or ID appears on their device. As Bluetooth-enabled devices proliferate, it becomes important that you know when and to which device you're connecting, so there may be a code to enter that helps ensure you're connecting to the correct device.

This pairing process can vary depending on the devices involved. For example, connecting a Bluetooth device to your iPad can involve different steps from those to pair a Bluetooth device to your car.

Bluetooth Limitations

There are some downsides to Bluetooth. The first is that it can be a drain on battery power for mobile wireless devices like smartphones, though as the technology (and battery technology) has improved, this problem is less significant than it used to be.

Also, the range is fairly limited, usually extending only about 30 feet, and as with all wireless technologies, obstacles such as walls, floors, or ceilings can reduce this range further.

The pairing process may also be difficult, often depending on the devices involved, the manufacturers, and other factors that all can result in frustration when attempting to connect.

How Secure Is Bluetooth?

Bluetooth is considered a reasonably secure wireless technology when used with precautions. Connections are encrypted, preventing casual eavesdropping from other devices nearby. Bluetooth devices also shift radio frequencies often while paired, which prevents an easy invasion.

Devices also offer a variety of settings that allow the user to limit Bluetooth connections. The device-level security of "trusting" a Bluetooth device restricts connections to only that specific device. With service-level security settings, you can also restrict the kinds of activities your device is permitted to engage in while on a Bluetooth connection. 

As with any wireless technology, however, there is always some security risk involved. Hackers have devised a variety of malicious attacks that use Bluetooth networking. For example, "bluesnarfing" refers to a hacker gaining unauthorized access to information on a device through Bluetooth; "bluebugging" is when an attacker takes over your mobile phone and all its functions.

For the average person, Bluetooth doesn't present a grave security risk when used with safety in mind (e.g., not connecting to unknown Bluetooth devices). For maximum security, while in public and not using Bluetooth, you can disable it completely.

 

From <https://www.lifewire.com/what-is-bluetooth-2377412>

 

What are some security concerns?

Depending upon how it is configured, Bluetooth technology can be fairly secure. You can take advantage of its use of key authentication and encryption. Unfortunately, many Bluetooth devices rely on short numeric personal identification numbers (PINs) instead of more secure passwords or passphrases.

 

If someone can "discover" your Bluetooth device, he or she may be able to send you unsolicited messages or abuse your Bluetooth service, which could cause you to be charged extra fees. Worse, an attacker may be able to find a way to access or corrupt your data. One example of this type of activity is "bluesnarfing," which refers to attackers using a Bluetooth connection to steal information off of your Bluetooth device. Also, viruses or other malicious code can take advantage of Bluetooth technology to infect other devices. If you are infected, your data may be corrupted, compromised, stolen, or lost. You should also be aware of attempts to convince you to send information to someone you do not trust over a Bluetooth connection.

How can you protect yourself? 

·         Disable Bluetooth when you are not using it. Unless you are actively transferring information from one device to another, disable the technology to prevent unauthorized people from accessing it.

·         Use Bluetooth in "hidden" mode. When Bluetooth is enabled, make sure it is "hidden," not "discoverable." The hidden mode prevents other Bluetooth devices from recognizing your device. This does not prevent you from using your Bluetooth devices together. You can "pair" devices so that they can find each other even if they are in hidden mode. Although the devices (for example, a mobile phone and a headset) will need to be in discoverable mode to initially locate each other, once they are "paired," they will always recognize each other without needing to rediscover the connection.

·         Be careful where you use Bluetooth. Be aware of your environment when pairing devices or operating in discoverable mode. For example, if you are in a public wireless "hotspot," there is a greater risk that someone else may be able to intercept the connection (see Securing Wireless Networks for more information) than if you are in your home or your car.

·         Evaluate your security settings. Most devices offer a variety of features that you can tailor to meet your needs and requirements. However, enabling certain features may leave you more vulnerable to being attacked, so disable any unnecessary features or Bluetooth connections. Examine your settings, particularly the security settings, and select options that meet your needs without putting you at increased risk. Make sure that all of your Bluetooth connections are configured to require a secure connection.

·         Take advantage of security options. Learn what security options your Bluetooth device offers, and take advantage of features like authentication and encryption.

 

From <https://us-cert.cisa.gov/ncas/tips/ST05-015>

                                                    

History of Bluetooth

WLAN technology enables device connectivity to infrastructure based services through a wireless carrier provider. The need for personal devices to communicate wirelessly with one another without an established infrastructure has led to the emergence of Personal Area Networks (PANs).

·         Ericsson's Bluetooth project, begun in 1994, defined the standard for PANs, enabling communication between mobile phones over low-power, low-cost radio interfaces.

·         In May 1998, companies such as IBM, Intel, Nokia and Toshiba joined Ericsson to form the Bluetooth Special Interest Group (SIG), whose aim was to develop a de facto standard for PANs.

·         The IEEE approved a Bluetooth-based standard named IEEE 802.15.1 for Wireless Personal Area Networks (WPANs); the standard covers the MAC and Physical layers.

Bluetooth specification details the entire protocol stack. Bluetooth employs Radio Frequency (RF) for communication. It makes use of frequency modulation to generate radio waves in the ISM band.

The usage of Bluetooth has increased widely because of its special features:

·         Bluetooth offers a uniform structure for a wide range of devices to connect and communicate with each other.

·         Bluetooth technology has achieved global acceptance such that any Bluetooth enabled device, almost everywhere in the world, can be connected with Bluetooth enabled devices.

·         Low power consumption of Bluetooth technology and an offered range of up to ten meters has paved the way for several usage models.

·         Bluetooth enables interactive conferencing by establishing an ad hoc network of laptops.

·         Bluetooth usage models include the cordless computer, intercom, cordless phone, and mobile phone.

 
