Software
Computer software — often called just software — is made of one or more computer programs. Sometimes it refers to one specific program; other times it means all the software on a computer, including the applications and the operating system. Applications are programs that perform a specific function, such as a game or a word processor. The operating system is software that helps applications run and controls the display and keyboard.
The word software came into use in the late 1950s to distinguish it from computer hardware, the physical parts of a machine that can be seen and touched. Software is the set of instructions the computer follows. Before compact discs or internet downloads, software arrived on paper punch cards, magnetic disks, or magnetic tape.
The word firmware describes software made specifically for a particular type of computer or electronic device. It is usually stored on a flash memory or ROM chip and directly controls a piece of hardware, such as the firmware for a CD drive or a modem.
Software Categories
Computer software can be grouped into three broad categories based on its function and field of use.
| Category | Description | Examples |
|---|---|---|
| Application Software | Programs for performing user tasks. | Word processors, web browsers, games |
| System Software | Used to start and run computer systems and networks. Includes operating systems. | Operating systems, utilities |
| Computer Programming Tools | Used to create application and system software by translating source code into executable programs. | Compilers, linkers, development environments |
Operating Systems
An operating system (OS) is software that manages computer hardware and software resources and provides common services for computer programs. It is an essential component of system software — application programs usually require an operating system to function.
The operating system acts as an intermediary between programs and the computer hardware for functions such as input and output and memory allocation. Operating systems are found on many devices, from cellular phones and video game consoles to web servers and supercomputers.
Types of Operating Systems
By Task Handling
- Single-tasking — can only run one program at a time.
- Multi-tasking — allows more than one program to run concurrently through time-sharing. Can be preemptive, where the OS slices CPU time, or cooperative, where each process voluntarily yields control.
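The difference between preemptive and cooperative multitasking can be illustrated with a minimal Python sketch (an illustration only, not how a real kernel is written): each task is a generator that voluntarily yields control back to a simple scheduling loop, exactly the cooperative model described above.

```python
# Sketch of cooperative multitasking: each task is a generator that
# does one unit of work, then voluntarily yields control back to a
# simple round-robin loop.
from collections import deque

def task(name, steps, log):
    # A cooperative task: record one step of work, then yield.
    for i in range(steps):
        log.append(f"{name}:{i}")
        yield

def run(tasks):
    # Resume each task until it yields; requeue it if it still has work.
    ready = deque(tasks)
    while ready:
        t = ready.popleft()
        try:
            next(t)
            ready.append(t)
        except StopIteration:
            pass  # task finished

log = []
run([task("A", 2, log), task("B", 3, log)])
print(log)  # tasks interleave: ['A:0', 'B:0', 'A:1', 'B:1', 'B:2']
```

Because each task yields on its own, a task that never yields would stall every other task, which is exactly why modern operating systems prefer preemptive multitasking.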
By User Count
- Single-user — no facility to distinguish users, though multiple programs may run in tandem.
- Multi-user — identifies processes and resources belonging to multiple users and allows them to interact with the system simultaneously.
By Architecture
- Distributed — manages a group of distinct computers and makes them appear as a single computer.
- Embedded — designed for embedded systems with limited resources; very compact and efficient. Examples include Windows CE and Minix 3.
- Real-time — guarantees processing of events within a defined time window; used in industrial control, spacecraft, and similar applications.
Common OS Families
| Family | Notable Members |
|---|---|
| Linux | Debian, Ubuntu, Linux Mint, Red Hat, Fedora, Android |
| BSD | FreeBSD, OpenBSD, NetBSD, macOS (derived from NeXTSTEP and BSD) |
| Microsoft Windows | Windows XP, Vista, 7, 8, 10, 11, and Windows Server editions |
| iOS and Android | Mobile operating systems; Android uses a modified Linux kernel |
Key OS Components
- Kernel — provides the most basic level of control over all hardware devices. Manages RAM access, determines which programs get which resources, sets CPU operating states, and organizes data storage.
- Device Drivers — translate OS commands into device-specific instructions, enabling hardware abstraction so applications do not need to know hardware details.
- User Interface — either a command-line interface (CLI), where the user types commands, or a graphical user interface (GUI), which presents a visual environment of windows, icons, and menus.
Functions of the Operating System
The primary purpose of an operating system is to provide a platform on which users can execute programs conveniently and efficiently. The OS manages the allocation of resources and services, including memory, devices, processors, and information.
Memory Management
The OS manages primary memory — a large array of bytes where each byte is assigned an address. The CPU must load a program into main memory before executing it. The OS tracks which memory addresses are in use, decides the order in which processes access memory, and allocates or deallocates memory as processes start or finish.
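The bookkeeping described above can be illustrated with a toy first-fit allocator in Python. This is a deliberate simplification: real allocators are far more sophisticated, but the idea of tracking free holes and placing requests is the same.

```python
# Toy first-fit memory allocation: the OS keeps a free list of
# (address, size) holes and places each request in the first hole
# large enough to satisfy it.
def first_fit(free_list, size):
    for i, (addr, hole) in enumerate(free_list):
        if hole >= size:
            if hole == size:
                free_list.pop(i)                           # hole used exactly
            else:
                free_list[i] = (addr + size, hole - size)  # shrink the hole
            return addr        # base address of the allocated block
    return None                # no hole large enough: allocation fails

free = [(0, 100), (200, 50)]   # two free holes in "RAM"
print(first_fit(free, 30))     # -> 0   (first hole shrinks to (30, 70))
print(first_fit(free, 40))     # -> 30  (hole shrinks again to (70, 30))
print(first_fit(free, 50))     # -> 200 (exactly fills the second hole)
print(free)                    # -> [(70, 30)]
```

Deallocation would add the freed block back to the list and merge adjacent holes, which is left out here for brevity.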
Processor Management
In a multi-programming environment, the OS decides which processes access the processor and for how long — a function called process scheduling. A program known as the traffic controller tracks process status, allocates the CPU to processes, and deallocates it when a process is complete or no longer required.
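One common scheduling policy, round-robin, can be sketched in a few lines of Python. The process names, burst times, and quantum below are made-up values for illustration.

```python
# Round-robin scheduling sketch: the "traffic controller" gives each
# process a fixed time quantum, then moves it to the back of the queue
# until its CPU burst is complete.
from collections import deque

def round_robin(bursts, quantum):
    # bursts: {pid: remaining CPU time}; returns the execution order.
    queue = deque(bursts)
    remaining = dict(bursts)
    order = []
    while queue:
        pid = queue.popleft()
        order.append(pid)
        run = min(quantum, remaining[pid])  # allocate the CPU for one slice
        remaining[pid] -= run
        if remaining[pid] > 0:
            queue.append(pid)               # not finished: requeue at the back
    return order

print(round_robin({"P1": 5, "P2": 3, "P3": 4}, quantum=2))
# -> ['P1', 'P2', 'P3', 'P1', 'P2', 'P3', 'P1']
```

Note how no process waits more than two quanta before getting the CPU again, which is the fairness property round-robin is chosen for.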
Device and File Management
The OS manages device communication through device drivers. It tracks all connected devices, designates an I/O controller for each one, and decides which process gets access to a device and for how long. For file management, the OS tracks storage locations, user access settings, and file status through a file system organized into directories.
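The idea of a file system organized into directories can be modeled minimally as nested dictionaries. This is only an illustration of hierarchical path lookup, not a real file-system implementation; the file names and sizes are invented.

```python
# Minimal model of a hierarchical file system: directories are dicts,
# files are leaf values (here, their size in bytes).
fs = {
    "home": {
        "alice": {"notes.txt": 120, "photo.png": 2048},
    },
    "etc": {"hosts": 64},
}

def lookup(tree, path):
    # Resolve a slash-separated path one component at a time,
    # descending into subdirectories until the target is reached.
    node = tree
    for part in path.strip("/").split("/"):
        node = node[part]
    return node

print(lookup(fs, "/home/alice/notes.txt"))  # -> 120
print(lookup(fs, "/etc/hosts"))             # -> 64
```

A real file system adds to this structure the storage locations, access permissions, and status information the OS tracks for each file.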
Other Important Functions
- Security — password protection and prevention of unauthorized access to programs and data.
- Performance Monitoring — records response times to help diagnose and troubleshoot system health.
- Job Accounting — tracks time and resources used by tasks and users for reporting purposes.
- Error Detection — continuously monitors the system to detect and help avoid malfunctions.
- Resource Allocation — decides who gets what resource and for how long across all running processes.
Virtual Memory
Virtual memory allows the OS to use disk space to supplement physical RAM. Memory accessed less frequently can be temporarily swapped to disk, freeing RAM for active programs. This gives the user the perception that far more RAM is available than physically exists in the machine.
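Page replacement, the mechanism behind this swapping, can be simulated in a short Python sketch. The example uses the LRU (least recently used) policy, one of several policies that real operating systems approximate; the page numbers are arbitrary.

```python
# Sketch of demand paging with LRU replacement: RAM is a fixed number
# of frames, and when it is full, the page unused for the longest time
# is swapped out to "disk".
from collections import OrderedDict

def lru_faults(references, frames):
    ram = OrderedDict()   # page -> present; insertion order tracks recency
    faults = 0
    for page in references:
        if page in ram:
            ram.move_to_end(page)        # page hit: mark most recently used
        else:
            faults += 1                  # page fault: load page from disk
            if len(ram) == frames:
                ram.popitem(last=False)  # evict the least recently used page
            ram[page] = True
    return faults

print(lru_faults([1, 2, 3, 1, 4, 2], frames=3))  # -> 5 page faults
```

Only the reference to page 1 is a hit here; with more frames, or a reference pattern with more locality, the fault count drops, which is why virtual memory works well in practice.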
Multitasking
Multitasking allows multiple programs to appear to run simultaneously by rapidly switching between them. The OS kernel's scheduling program determines how much time each process gets and in which order, saving the state of one process and restoring that of the next in what is called a context switch. Modern operating systems use preemptive multitasking, ensuring no single program can monopolize the CPU indefinitely.
Linux vs. Windows
Two of the most widely used operating systems are Linux and Microsoft Windows. Each has distinct strengths depending on the context. Understanding their differences helps when selecting the right platform for a given need.
| Topic | Linux | Windows |
|---|---|---|
| Price | Free and open-source. Most distributions cost nothing to download and install. | Typically $99–$199 USD per licensed copy. |
| Ease of Use | Modern distributions such as Ubuntu and Linux Mint include GUIs and are increasingly user-friendly, with fewer pre-installed extras. | One of the easiest desktop OSes to use. Designed for user-friendliness, though power users may find system-level control limited. |
| Reliability | Renowned for reliability and stability, with a strong focus on process management and uptime. | Improved significantly in recent years but historically considered less reliable than Linux. |
| Software | Thousands of free and open-source programs. Many Windows applications can run via compatibility layers such as WINE. | Largest selection of desktop software and video games by a wide margin. |
| Software Cost | Most programs are free and open-source, including applications such as GIMP and LibreOffice. | Many programs require purchase. Commercial software typically ranges from $4.99 to $99. |
| Hardware | Supports a wide range of hardware including older equipment. Hardware support has grown to match Windows over the past decade. | Virtually all consumer hardware supports Windows. Ideal for newer equipment. |
| Security | Highly secure. Open-source code allows community review, making vulnerabilities easier to identify and fix. | Improved greatly over the years, but as the most widely used OS it remains the primary target for malware and viruses. |
| Support | Large online communities, extensive documentation, and enterprise support options available. | Integrated help systems, vendor support, and thousands of reference books at every skill level. |
| Common Use Cases | Servers, cloud infrastructure, embedded systems, scientific computing, and users who prioritize security and reliability. | Desktop users, gamers, business environments relying on Microsoft software, and novice users. |
What Is the SDLC?
The System Development Life Cycle (SDLC) is a structured methodology and process that guides the development of an information system. It is based on a series of related activities combined into phases — sometimes called life-cycle phases — that represent a state or stage in the life of an information system.
Generally speaking, an information system life cycle proceeds from requirements gathering to design and development, to operations and maintenance, and finally to decommissioning. Each successive phase leverages the documentation and knowledge gained from the previous phases.
Why the SDLC Matters
The main purpose of using the SDLC is to promote quality during the design, development, and implementation effort. When used properly, an information system is more reliable and cost-effective because project activities are planned, documented, tracked, and controlled.
An information system is more than just software and hardware. Effective use of technology also depends on solid processes, procedures for meeting business objectives, and skilled people who operate and manage the system. The relationship between technology, processes, and people is symbiotic — any change to one component affects the others.
A life-cycle approach ensures there is a clear plan for identifying and validating requirements early, designing and developing based on those requirements, deploying the completed system, operating and maintaining it over time, and decommissioning it when no longer needed.
SDLC Phases and Approaches
While the number of SDLC activities varies depending on project type and complexity, common guidelines allow activities to be grouped into clearly defined phases. Following these guidelines helps mitigate risks from missed requirements, schedule delays, and cost overruns.
Four-Phase Approach
Used when an organization has a good understanding of its requirements. It groups related activities into four major phases.
| Phase | Key Focus |
|---|---|
| Planning | Determine whether the system is needed and what it will do. Assess technical, financial, and operational feasibility. Create a project plan and assign a project manager. |
| Analysis | Decide whether to proceed and verify resources are available. Conduct a feasibility study. Identify improvements and confirm budget and resources exist. |
| Design | Develop a detailed plan — blueprints, not working software. Define system architecture, data structures, user interfaces, and security controls. |
| Implementation | Build, test, and deploy the system. Includes coding, installation, user training, documentation, and ongoing maintenance. |
Nine-Phase Approach
A more granular model suited for organizations implementing an unfamiliar type of information system or one deployed across all business units. It expands the preliminary investigation, requirements analysis, and recommendation activities into distinct phases.
| Phase | Key Activities |
|---|---|
| Initiation | Develop business case, identify project sponsor, appoint project manager, develop and approve concept proposal. |
| System Concept Development | Analyze business need, form project team, plan project, develop acquisition strategy, identify risks, obtain funding. |
| Planning | Refine acquisition strategy, analyze schedule, establish stakeholder agreements, develop project-management plan. |
| Requirements Analysis | Define functional and technical requirements, conduct reviews, and approve requirements. |
| Design | Design system and business processes, outline operations and maintenance manuals, outline deployment plan, approve design. |
| Development | Refine software requirements and design, acquire hardware, code and test software, install, complete documentation. |
| Integration and Test | Subsystem and system testing, security testing, user-acceptance testing, obtain user acceptance. |
| Implementation | Execute training plan, perform data migration, install new system, conduct post-implementation evaluation. |
| Operations and Maintenance | Operate system, perform maintenance, monitor for changes, recommend modifications and updates. |
Ten-Phase Approach
Used by the U.S. Department of Justice, this approach is similar to the nine-phase model but adds a dedicated Disposition phase. This phase formally plans how to safely retire and shut down the information system when it is no longer needed. End-of-life planning in this phase often feeds back into a new SDLC cycle for the replacement system.
SDLC Models
SDLC models are frameworks that help project and development teams correctly follow the stages required to develop various types of information systems. Different models suit different project contexts.
Waterfall Model
The waterfall model is often considered the foundation of modern SDLC methodology. This linear, sequential model has been in use in one form or another since the 1960s. Each phase must be fully completed before the next begins — like water flowing over a cliff, there is no going back.
- Advantage. Allows for direct management control. A clear timeline can be established with specific deadlines, and the project moves through development like a product on an assembly line.
- Disadvantage. No returning to a previous phase. Once in design, it is difficult to modify a requirement that was not well thought out earlier. Complex systems often require a more iterative approach.
Fountain Model
The fountain model recognizes that phases may overlap and that earlier phases may have to be revisited as the team learns more. Like water in a fountain, details are pushed upward through the phases but can flow back down at any time to be refined.
- Advantage. Changes can be made to system components as the team uncovers gaps in the concept, requirements, or design.
- Disadvantage. May take more time and cost more. Without strong project management, the project may never finish if the team gets caught in ever-increasing scope and changing requirements.
Build-and-Fix Model
Build-and-fix is the simplest and least structured model in the SDLC family. A working prototype is built and modified as often as necessary until it satisfies user needs, with no upfront requirements analysis or design.
- Advantage. Efficient for very small, low-priority efforts involving a single customer. Ensures frequent customer involvement in development.
- Disadvantage. Usually costs more than a properly planned approach. Strongly discouraged except for small, low-priority projects.
Rapid Application Development (RAD)
RAD is used when software development is heavily dependent on end users' knowledge of business processes. Users examine a live prototype and provide feedback rather than commenting on documentation — similar to a tailor making a custom suit through multiple fitting sessions.
- Advantage. Lower rejection rate when the system goes into production. Design errors can be caught earlier. End users take ownership of the finished product.
- Disadvantage. Risk of cost and schedule overruns. Users may underestimate development complexity and continuously request new enhancements, leading to scope creep.
Agile Model
The agile model builds systems in small incremental releases, each tested by customers. It requires close collaboration between customers, developers, and testers throughout the project. Rather than waiting until the end to deliver a finished product, agile teams build a small part of the system, test it, get feedback, improve it, and repeat.
Key Characteristics
- Short development cycles called iterations or sprints
- Frequent testing and user review of each release
- Close communication between users and developers throughout the project
- Flexibility to accommodate changing requirements
When Agile Works Best
Agile is most effective when requirements are unclear at the start, user needs may change over time, or fast feedback is critical to success. It helps teams avoid investing in building the wrong system.
Agile Challenges
Agile can struggle when users are unsure of what they want or project goals keep shifting. Without strong project management this can lead to rework, delays, and confusion about direction.
Testing and Project Management
Regardless of the model used, testing should be incorporated into every phase of the life cycle — not reserved only for the end of development.
Types of Testing
| Test Type | What It Does |
|---|---|
| Unit Test | Focuses on a single subsystem to verify it operates correctly and produces results according to its specifications. |
| Integration Test | Uses real data to test whether multiple units continue to work properly together. Verifies that output from one unit is correctly applied to another. |
| System Test | Determines whether all components of the system work together seamlessly. Especially important when different teams build separate subsystems. |
| Acceptance Test | End users verify that the system performs according to their specifications before formally accepting the product. |
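A hypothetical example of a unit test is sketched below, using Python's built-in unittest framework to verify a single function against its specification. The function apply_discount is invented for this illustration.

```python
# Unit-test sketch: verify one small unit of code against its
# specification, independent of the rest of the system.
import unittest

def apply_discount(price, percent):
    # Unit under test (hypothetical): apply a percentage discount.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    # Each test method checks one aspect of the specification.
    def test_basic_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(19.99, 0), 19.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(50.0, 150)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestApplyDiscount)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all tests passed:", result.wasSuccessful())
```

Integration and system tests build on the same idea but exercise several units together; acceptance tests are written from the end user's point of view rather than the developer's.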
The SDLC and Project Management
Project management is a discipline that uses a systematic process to plan, manage, execute, and control projects. When an information system is needed, the SDLC provides the framework while project management provides the discipline to plan, schedule, and control the associated activities.
As an information system moves through its life-cycle phases, it may spawn several separate projects — for example, to determine the business need, to evaluate and select a vendor, to update an aging system, or to formally decommission a retired one. In most cases, the project ends when the system moves into the operations and maintenance phase.
Conclusion
Software is what makes hardware useful. Operating systems provide the platform that all other programs depend on, managing memory, processors, devices, and files so that applications can run without needing to handle hardware directly. Understanding the difference between Linux and Windows, and knowing when each is appropriate, is a practical skill in any technology environment.
The SDLC gives organizations a structured way to move from idea to working system to eventual retirement. Whether using a simple four-phase approach or a detailed nine-phase model, the goal is the same: reduce risk, improve quality, and ensure the system actually meets the needs that motivated it. Different models — waterfall, fountain, RAD, agile — reflect different assumptions about how well requirements are known at the start and how much flexibility the project needs.
Information systems do not build themselves and do not maintain themselves. They require planning, structured development, ongoing care, and deliberate decommissioning. The concepts in this unit describe how that process works.