
BIOS: Overview and Security

Basic Input/Output System (BIOS), also known as the system BIOS or ROM BIOS, is a standard defining a firmware interface

Computer security has become much harder to manage in recent years because attackers continuously devise new and more effective ways to attack our systems. As attackers become increasingly sophisticated, we as security professionals must ensure that they do not have free rein over the systems we are hired to protect. An attack vector that many people forget to consider is the boot process, which is almost completely controlled by the BIOS.

The BIOS is a privileged piece of software that is generally ignored by day-to-day users, who are thus usually unaware of its importance to our computers. The Basic Input/Output System was first created by Gary Kildall around 1975 for use in his CP/M operating system, and this became what we now know as the conventional BIOS. The term was later adopted for the firmware in IBM PC compatibles, beginning with the IBM PC in 1981, and MS-DOS retained a similar layer known as the DOS BIOS. These early systems were responsible only for basic pre-boot hardware initialization before handing control to the bootloader. That was fine 30 years ago, when software was simpler and attacks were rare, so the BIOS was not designed with security in mind. In today's world this is no longer acceptable: the BIOS lacks several security features, which leaves it vulnerable to external attack.

These are some notable attacks carried out against BIOS systems:

Chernobyl Attack (1998) - Also known as CIH or Spacefiller, this was the first major attack on BIOS systems. The virus installs itself in Windows memory, hooks file-access calls, and infects all currently executing programs. It then attempts to flash the BIOS ROM, overwriting it with zeros. A second payload destroys the Master Boot Record (MBR) by filling the first megabyte of the hard disk with zeros.

Mebromi (2011) - Comprises a BIOS rootkit, an MBR rootkit, a Trojan downloader, and a PE infector. The Trojan deletes a specific registry value and checks the BIOS manufacturer. If it is an Award BIOS, the malware infects the BIOS ROM and in turn alters the Master Boot Record (MBR), allowing an infected program to execute at each operating system start-up.

This article outlines several such attack vectors and suggests mechanisms for mitigating attacks against the BIOS.

BIOS (Basic Input Output System)

Basic Input/Output System (BIOS), also known as the system BIOS or ROM BIOS, is a standard defining a firmware interface. BIOS software is built into the PC, and is the first software run by a PC when powered on. The fundamental purposes of the BIOS are to initialize and test the system hardware components, and to start the boot loader or an operating system from a secondary storage device. It also takes care of essential system functions such as power management and temperature regulation. It provides an abstraction layer for the underlying hardware by providing a consistent way for operating systems and application programs to interact with various input/output devices.

Changes in system hardware are hidden by the BIOS from programs that use BIOS services instead of accessing the hardware directly. BIOS software is stored on a non-volatile ROM chip on the motherboard, and each build is tailored to a particular model of computer, interfacing with the various devices that make up the system's complementary chipset. In modern PCs the BIOS contents are stored on an EEPROM chip.

An EEPROM (Electrically Erasable Programmable Read-Only Memory) chip is a type of non-volatile memory used by many electronic devices that require small amounts of data to be stored for quick access. The contents of an EEPROM chip can be flashed, i.e., overwritten with new data. This allows BIOS software to be easily upgraded with new features and bug fixes, but it is also one of the reasons BIOS chips are vulnerable to attack.

Why BIOS Is in Blue
Most BIOS screens are blue; this stems from how BIOS manufacturers use the standard text-mode color attributes. A BIOS color attribute is an 8-bit value in which the lower 4 bits select the character (foreground) color and the upper 4 bits select the background color. To print a white character on a blue background, the BIOS color attribute is set to the hexadecimal value 0x1F.

Under certain conditions, setting the highest bit of the background color makes the text blink instead of intensifying the background, so that bit is normally kept clear. Blue, whose palette value is 0x1, therefore gives a steady, uninterrupted display with clearly legible text, which is why it became the conventional choice.
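The attribute byte described above can be composed with a couple of bit operations. A minimal sketch (the function name is illustrative; the color numbers are the standard CGA/VGA text-mode palette values):

```python
# Lower 4 bits: foreground (character) color; upper 4 bits: background.
BLUE = 0x1
WHITE = 0xF

def make_attribute(foreground: int, background: int) -> int:
    """Pack foreground and background color nibbles into one attribute byte."""
    return ((background & 0x0F) << 4) | (foreground & 0x0F)

# White text on a blue background, as on a typical BIOS setup screen:
attr = make_attribute(WHITE, BLUE)
print(hex(attr))  # 0x1f

# Bit 7 (the high bit of the background nibble) may cause blinking,
# so it is left clear for a steady display.
assert attr & 0x80 == 0
```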

Top BIOS Manufacturers
BIOS software is developed by several companies around the world and is usually deeply integrated with the system motherboard. Some of the most popular BIOS manufacturers are:

  • American Megatrends (AMI)
  • Phoenix Technologies
  • Award
  • IBM
  • Winbond

Role of BIOS
The BIOS has an essential role in the boot process of the computer also known as bootstrapping. It initializes system hardware, manages ACPI, and regulates CPU temperatures during the booting process. The major responsibilities of the BIOS are listed below:

  1. Establish Trust: The BIOS is responsible for verifying the integrity of the system's hardware components and authenticating them before use. This is done with the help of the Core Root of Trust for Measurement (CRTM), which checks that the hardware is valid and that its integrity has not been compromised.
  2. Test Hardware: The BIOS also initializes and tests the hardware present on the computer before it is used. The motherboard, chipset, and memory are included in this test, which is generally carried out during POST (Power-On Self-Test).
  3. Load Additional Modules: Several devices on the computer may require additional firmware for their proper functioning. The BIOS ensures that such firmware modules are loaded and executed. They may be stored in the BIOS chip itself or on a secondary storage device.
  4. Boot Device Selection: After the above steps, the BIOS searches for a valid boot device, e.g., a USB drive or hard disk. Once such a device is found, the BIOS executes the bootloader found on that device.
  5. Start Operating System: The actual bootstrapping then begins: the bootloader executes and loads the OS kernel into memory. Once the kernel has been initialized, the BIOS transfers full control to the operating system.
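The sequence of responsibilities above can be sketched as a simple ordered pipeline. Every function and device name below is a hypothetical placeholder, not real firmware code; the point is only the ordering and the fail-fast behavior:

```python
# Illustrative sketch of the BIOS boot responsibilities listed above.
def establish_trust():
    # CRTM: measure and verify firmware integrity before anything else runs.
    return True

def power_on_self_test():
    # POST: initialize and test motherboard, chipset, and memory.
    return True

def load_option_roms():
    # Load additional firmware modules (e.g., video or network option ROMs).
    return ["video_rom", "network_rom"]

def select_boot_device(candidates):
    # Return the first device in priority order that is bootable.
    for device in candidates:
        if device["bootable"]:
            return device
    raise RuntimeError("no bootable device found")

def boot():
    assert establish_trust()
    assert power_on_self_test()
    load_option_roms()
    device = select_boot_device(
        [{"name": "usb", "bootable": False}, {"name": "hdd", "bootable": True}]
    )
    return f"handing control to bootloader on {device['name']}"

print(boot())  # handing control to bootloader on hdd
```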

BIOS Overview
System BIOS comes in two types:

  • Legacy BIOS
  • BIOS based upon the UEFI specification

Conventional BIOS (Legacy BIOS)
The legacy or conventional BIOS is the tried-and-true BIOS type that has been around for years. It is generally a 16-bit program flashed onto a ROM chip on the computer's motherboard. This type of BIOS is very outdated and more vulnerable to attack, so it is advisable to use the newer, more robust specification.

The key component of a conventional BIOS is the boot block. This part is logically separated from the rest of the BIOS and is executed first during the BIOS boot process. The boot block checks the integrity of the remaining firmware and recovers any portion that is corrupted. It then initializes almost all the hardware in the system via the Power-On Self-Test (POST), during which low-level components such as memory, the CPU, and the chipset are initialized.

After this the BIOS loads other option ROMs, such as those for video cards, SCSI controller cards, and network boot, each of which carries its own firmware. These option ROMs register their functionality with the BIOS so they can be invoked later in the boot process, in the order the user has selected. The BIOS then checks for a Master Boot Record (MBR) in order of boot-device priority. The first storage device with valid MBR data is selected; the MBR points to the corresponding operating system bootloader, which in turn loads the operating system.

During a conventional boot, System Management Mode (SMM) is set up via SMI handlers and ACPI table code. SMM is a highly privileged execution mode that can override almost all of protected mode's hardware security mechanisms. To enable it, the BIOS loads the SMI handlers and initializes the ACPI tables and code.

Legacy BIOS Boot Process
When a computer is first powered on, the BIOS is the first piece of software executed. The boot block runs a POST (Power-On Self-Test), ensuring that all the hardware in the system is valid and accounted for. After the POST screen the user may enter the BIOS setup screen by pressing a pre-designated key, which varies by BIOS manufacturer, or let the current operating system continue to boot. The BIOS then checks whether additional firmware must be loaded for individual devices; if so, those modules are loaded and executed.

Unified Extensible Firmware Interface (UEFI)
UEFI (Unified Extensible Firmware Interface) is a specification that was first designed by Intel in the 1990s for its Itanium range of computer systems. It was originally called the EFI specification and was intended to be a better replacement for legacy BIOS systems. UEFI has several advantages over the conventional BIOS and is radically different from these older systems.

The UEFI specification defines a programmable software interface that lies between the device firmware and the operating system. It provides an almost OS-like interface to device firmware. Depending on the manufacturer it may lie on top of the BIOS but it's generally placed in the /EFI/ directory on some form of non-volatile memory. This may either be a NAND chip on the motherboard, a hard drive or even on a network share.

Differences Between UEFI and Legacy BIOS
There are several differences between conventional BIOS and UEFI systems and many of them add greater functionality and power to the computer. It also provides a more efficient and secure booting mechanism.

  1. Larger Address Space: Conventional BIOS was forced to work in 16-bit mode with a maximum of 1 MB of addressable space. UEFI can run in 32- or 64-bit mode, allowing larger and more sophisticated pre-boot programs.
  2. Support for Larger File Systems: Traditional BIOS only supports booting from disks with MBR partitions. The MBR scheme supports only four primary partitions per disk and a maximum disk size of 2 TB. UEFI supports booting from GPT (GUID Partition Table) disks, which allows extremely large disks of up to 8 ZiB.
  3. Improved Security Capabilities: The UEFI specification also improves on the security aspects of the older BIOS systems. It supports several security features such as secure boot. It also has provisions for providing basic cryptographic and public key infrastructure.
  4. CPU independent design: UEFI has employed a CPU independent design methodology, i.e., it can run on many different types of architectures. The code available is compiled differently for the required platform.
  5. Powerful Execution Environment: The UEFI specification provides a much more powerful execution environment for computers. It allows special features such as booting over a network, using the mouse, ACPI control and even browsing the web.
  6. Improved Performance: UEFI-compliant operating systems have been seen to have a significant performance boost not just during the boot process but also during running and powering off the system.
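The 2 TB MBR ceiling mentioned above follows directly from the on-disk format: MBR stores sector addresses in 32-bit fields, and with the traditional 512-byte sector size the largest addressable disk is 2^32 × 512 bytes. GPT uses 64-bit sector addresses, which yields the 8 ZiB figure. A quick arithmetic check:

```python
# MBR uses 32-bit LBA (logical block address) fields, so it can address
# at most 2**32 sectors. With 512-byte sectors that caps disks at 2 TiB.
sector_size = 512  # bytes, the traditional sector size

mbr_max_bytes = 2 ** 32 * sector_size
print(mbr_max_bytes)            # 2199023255552
print(mbr_max_bytes / 2 ** 40)  # 2.0  (TiB)

# GPT uses 64-bit LBA fields, raising the ceiling to 8 ZiB.
gpt_max_bytes = 2 ** 64 * sector_size
print(gpt_max_bytes / 2 ** 70)  # 8.0  (ZiB)
```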

Windows 8 uses UEFI
UEFI, although supported by several operating system vendors for years, did not see widespread adoption until the release of Windows 8. Windows 8 incorporates the best parts of UEFI, particularly the Secure Boot feature.

Secure Boot
One of UEFI's most interesting features is Secure Boot, which allows the system to boot only an authenticated OS kernel. Windows 8 relies heavily on this mechanism to ensure that only authenticated firmware with a validated kernel image can be booted. This is quite different from older bootstrapping methods, in which the BIOS would load and execute any bootloader code.

With Secure Boot, before the firmware hands full control to the OS bootloader, it verifies that the bootloader and firmware components have been signed. This is done using cryptographic signatures embedded in the firmware by the OEM. During the boot process the firmware compares each component's signing key against a database of authorized keys, rooted in the platform key; if the key is in the database the component is allowed to execute, otherwise it is rejected.

This ensures that only authenticated components are loaded and that malicious bootloader code is never executed. The Secure Boot mechanism in Windows 8 significantly reduces the chance of boot-sector viruses and bootkits launching and affecting the machine's boot process.
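The key check at the heart of Secure Boot can be modeled very simply: the firmware keeps a database of allowed signing keys and refuses to run any image whose key is not in it. This is a deliberately simplified sketch; real firmware verifies X.509/RSA signatures over the image rather than comparing key fingerprints, and all names here are illustrative:

```python
import hashlib

# The firmware's database ("db") of authorized signing-key fingerprints.
allowed_db = {
    hashlib.sha256(b"oem-platform-key").hexdigest(),
    hashlib.sha256(b"os-vendor-key").hexdigest(),
}

def may_execute(image_key: bytes) -> bool:
    """Allow an image to run only if its signing key is in the database."""
    return hashlib.sha256(image_key).hexdigest() in allowed_db

print(may_execute(b"os-vendor-key"))  # True  -> bootloader runs
print(may_execute(b"bootkit-key"))    # False -> image rejected
```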

UEFI Boot Process
The UEFI boot process is much like that of a conventional BIOS, with a few changes. It is divided into stages that run sequentially and end with the complete handover of control to the operating system. Unlike legacy BIOS, UEFI boot code runs in 32-bit or 64-bit protected mode on the CPU rather than 16-bit mode.

UEFI likewise starts with a small amount of code that kicks off the entire boot process. This phase is called the Security phase (SEC) and acts as the core root of trust. It is followed by Pre-EFI Initialization (PEI), which, like the legacy pre-boot initialization phase, checks device firmware before boot. Then the Driver Execution Environment (DXE) phase begins, in which additional device drivers are initialized; devices such as network cards and graphics cards are brought up here.

The boot device is selected during the BDS (Boot Device Selection) phase. This then transfers control to the bootloader that is located in a GPT partition; the bootloader handles the loading of the OS kernel into memory.

Common BIOS Threats
The BIOS is always written to a non-volatile storage device such as an EEPROM, which allows the ROM contents to be overwritten to deliver bug fixes and updates for the particular BIOS version. However, this also creates great potential for misuse: malicious programs with sufficient access can modify the ROM contents as well.

User Initiated Attack
This type of attack is carried out by an end user who flashes the BIOS with an unauthenticated update file, either unknowingly, with no prior knowledge about the file, or with malicious intent.

Malware Attack
Malware attacks can exploit a vulnerability in the BIOS. For example, an attacker can use a vulnerable BIOS update to open a backdoor into the system or to crash the BIOS.

Network Based or Organizational Attack
This is a large-scale attack at the organizational level. An attacker who gains access to a compromised update server can mount an organization-wide attack, infecting every system by replacing authorized BIOS images with malicious ones.

How Do We Mitigate Common BIOS Threats
This section describes the security measures an organization should implement to secure the BIOS. Because a BIOS vulnerability compromises the entire system, every organization should follow predefined guidelines to secure the BIOS. The following methods can be implemented in an enterprise to enhance BIOS security.

To counter malicious attacks on the BIOS, we can implement the following methods:

  • Digital Authentication Method
  • Rollback Prevention Method
  • Physical Authentication Method

Digital Authentication Method
In this method, the authenticity of a BIOS update is ensured through digital signatures: updates are installed only after their authenticity has been verified. Update images digitally signed by the BIOS manufacturer serve as the root of authorization. The process can be automated with a signature-verification algorithm that validates the digital signatures, and it must be integrated with strong security features.
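The control flow of signature-checked updates can be sketched as follows. Production systems use asymmetric signatures (e.g., RSA, with the vendor's public key anchored in the boot block); this sketch substitutes an HMAC purely to keep the example self-contained, and all names are illustrative:

```python
import hashlib
import hmac

VENDOR_KEY = b"vendor-secret"  # stand-in for the vendor's signing key

def sign_update(image: bytes) -> bytes:
    """Vendor-side: produce a signature over the update image."""
    return hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()

def install_update(image: bytes, signature: bytes) -> str:
    """Firmware-side: refuse any image whose signature does not verify."""
    expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, signature):
        return "rejected: signature invalid"
    return "installed"

official = b"bios-v2.1-image"
print(install_update(official, sign_update(official)))           # installed
print(install_update(b"tampered-image", sign_update(official)))  # rejected: signature invalid
```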

Rollback Prevention Method
Implement a mechanism that ensures BIOS update images cannot be rolled back to previous versions: an update image is installed only if its version number is greater than that of the currently installed one. This prevents the BIOS from being rolled back to an earlier image that contains a vulnerability.

In some cases the current version must be rolled back to an earlier one, e.g., when the installed BIOS contains a vulnerability, no newer update is available, and an earlier version is more stable than the current one. In that case the responsible authority must verify that the earlier version does not itself contain any vulnerability.
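The rollback rule above reduces to a strict version comparison, with an explicit administrator override for the vetted-downgrade case. A minimal sketch; the function names and version format are illustrative:

```python
def parse_version(v: str) -> tuple:
    """Turn '2.1.0' into (2, 1, 0) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def may_install(current: str, candidate: str, admin_override: bool = False) -> bool:
    """Reject downgrades unless an authority has explicitly vetted them."""
    if admin_override:
        # Deliberate rollback to a known-good older image, permitted only
        # after the responsible authority verifies the older version.
        return True
    return parse_version(candidate) > parse_version(current)

print(may_install("2.1.0", "2.2.0"))                        # True
print(may_install("2.1.0", "2.0.9"))                        # False
print(may_install("2.1.0", "2.0.9", admin_override=True))   # True
```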

Physical Authentication Method
This method ensures the authenticity of update images by requiring the physical presence of the responsible authority (a system administrator), who verifies the update image and flashes the BIOS only if the image is valid. It can complement the digital authentication method, serving as a recovery mechanism in situations such as a BIOS crash.



More Stories By Albert Fruz

Albert Fruz has five years of experience in the information security field, encompassing SIEM, malware analysis, ISO 27001 audits, rule-based auditing for firewall forensics, and PCI DSS audits.
