
CLOUD COMPUTING
Cloud computing is a model that enables the sharing of computing resources over the Internet rather than relying on local servers or personal devices.
It works as a common facility, an application or service that can be used by many different users at the same time. The "cloud" is a metaphor for the Internet. Well-known examples of cloud computing applications include Google Drive and Apple iCloud.
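As a rough illustration of how an application can use a shared cloud resource, the sketch below uploads a file to Amazon S3 object storage and downloads it again using the boto3 SDK for Python. The bucket name and file names are hypothetical placeholders, and the snippet assumes that cloud credentials are already configured on the machine.

import boto3

# Create a client for the S3 object-storage service.
# Credentials are assumed to be configured already (e.g. via environment variables).
s3 = boto3.client("s3")

# Upload a local file into a shared bucket (the bucket name is a placeholder).
s3.upload_file("report.pdf", "example-shared-bucket", "reports/report.pdf")

# Any authorized user or application can later retrieve the same object.
s3.download_file("example-shared-bucket", "reports/report.pdf", "copy-of-report.pdf")

The point of the example is that the file now lives on the provider's infrastructure rather than on one local machine, so other users and applications can reach it from anywhere with an Internet connection.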

Advantages of Cloud Computing 

Cost: Cloud computing eliminates the capital expense of buying hardware and software, setting up and running on-site data centers, and provisioning servers.
Productivity: On-site data centers typically require a lot of hardware setup, software patching, and other time-consuming IT management. Cloud computing removes the need for many of these tasks, so IT teams can spend their time on more important business goals.
Performance: The biggest cloud computing services run on a worldwide network of secure data centers, which are regularly upgraded to the latest generation of fast and efficient computing hardware.
Reliability: Cloud computing makes regular data backup and data recovery easier, because data is mirrored at multiple redundant sites on the cloud provider's network (see the sketch after this list).
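A minimal sketch of the idea behind that redundancy, assuming the same boto3 setup as above: the same object is written to buckets in two different regions, so a copy survives if one site becomes unavailable. The bucket names, region names, and file name are placeholders; in practice the cloud provider usually mirrors data automatically, and this only makes the idea explicit.

import boto3

# Clients pointed at data centers in two different regions (placeholder regions).
primary = boto3.client("s3", region_name="us-east-1")
backup = boto3.client("s3", region_name="eu-west-1")

# Read the data to be protected (placeholder file name).
with open("customer-records.csv", "rb") as f:
    data = f.read()

# Write the same object to both sites so a copy survives a regional outage.
primary.put_object(Bucket="example-primary-bucket",
                   Key="backups/customer-records.csv", Body=data)
backup.put_object(Bucket="example-backup-bucket",
                  Key="backups/customer-records.csv", Body=data)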
