The Last Lecture by Randy Pausch
Randy Pausch was a professor of computer science at Carnegie Mellon University who, after being diagnosed with terminal cancer, gave celebrated lectures on achieving childhood dreams and on time management while his days were numbered. Mixing humor with wisdom, Pausch starts off by discussing the importance of childhood dreams, how much factors like good parents matter in achieving them, and how much work one must put in to live those dreams. He then tells some of his own personal tales of how he managed to live his childhood dreams.
He covers several other important themes in his lecture, including the significance of enjoying your work in order to reach your maximum potential. He argues that when you start relishing your work, progress speeds up and the work is finished well ahead of your expectations. He also advises trying novel and innovative things you have never considered before: according to him, if you step out of your comfort zone and experience different aspects of life, you will be amazed by your own skills and potential. He further emphasizes the significance of prioritizing tasks, carrying them out in order of importance and getting the most important ones out of the way first. Finally, he suggests that the key to managing your time is not to waste it on unimportant activities that produce no output. For example, spending too much time with friends and family while an important task is pending is wasteful, and learning to say 'No' to such things frees up an enormous amount of time.
He concludes his lecture by suggesting that eating, sleeping and working habits are key to a person's success, and that if you manage your time effectively and efficiently, there is nothing you cannot do.
In today's fast-paced corporate environment, it is extremely difficult to continuously ensure that all employees in a company are using up-to-date hardware, that all their software is individually licensed for each machine, and so on. A much simpler technology has therefore been developed in recent years to take care of this: it runs all your complex programs, word processors, email clients, social networks and so forth on remote machines and servers. The user only needs a machine capable of running the "interface software", in most cases just a web browser, and it can perform the same tasks as any other high-end machine.
Services like Facebook, Twitter, Gmail, Dropbox and many others, including Apple's iCloud, make use of this technology. These services are delivered over the internet, though others may also operate on a local network.
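The thin-client idea described above can be sketched in a few lines: a server process does the processing, and the client needs nothing more than a simple interface (here, a plain HTTP request to localhost). The `/hello` path and the uppercasing "service" are invented purely for illustration.

```python
# Minimal sketch of the cloud thin-client model: the server does the
# work, the client only sends a request and displays the result.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class CloudishHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # All "heavy" processing happens here, on the server side.
        text = self.path.lstrip("/")
        result = text.upper().encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(result)))
        self.end_headers()
        self.wfile.write(result)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), CloudishHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/hello"
with urllib.request.urlopen(url) as resp:
    answer = resp.read().decode()
print(answer)  # HELLO
server.shutdown()
```

A real cloud service differs mainly in scale, not in shape: the same request/response pattern, with the browser as the interface software.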
Some of the types of Cloud Computing are as follows:
· Infrastructure as a service (IaaS)
· Platform as a service (PaaS)
· Software as a service (SaaS)
· Storage as a service (STaaS)
· Desktop virtualization
· API as a service
· Security as a service (SECaaS)
· Data as a service (DaaS)
Cloud Computing may be deployed on various levels. A Public cloud makes services available to the general public, free or at a cost depending on the scenario. These services are offered via the internet. An example is Google Maps.
A Private cloud is a cloud infrastructure that is operated for a single company, which may be managed internally by the IT department or by a third-party host.
Community cloud is also used to share the infrastructure between various organizations which have converging interests or some common ground. This can be managed, again, internally or externally. One of the benefits is the sharing of costs between the parties involved in this form of cloud computing.
A Hybrid cloud can be implemented as well, consisting of two or more of the above-mentioned types of clouds. This form of cloud architecture makes users less dependent on internet connectivity and also offers a higher level of fault tolerance and scalability.
Despite its plentiful advantages, this form of computing has faced criticism as well, one concern being the infringement of privacy by host companies. An example is the secret NSA program which, in collaboration with AT&T and Verizon, reportedly collected records of millions of phone calls, raising many questions about the secure handling of users' data.
Computer networks are collections of computers and hardware connected to each other so that communication can take place between all the connected machines. Their main purpose is to connect different computers so that files can be shared and communication is possible. For example, in an office where many workers need access to information from colleagues and need to forward their work to them, a computer network comes in handy.
Computer network connections come in two types: wired and wireless. In wired connections, the computers are linked by cables and communication takes place through electrical signals in the wires; sometimes a server controls all the traffic and directs file transfers and communication to the appropriate destination. Wireless communication takes place by transmitting signals as radio waves; it requires modems and routers to send and receive signals, but it is useful because it does not need a cable running from every computer to all the others. In terms of performance, wired networks have the advantage, as signals travel through the wires quickly, and file-transfer speeds are usually higher on wired networks.
So, where computers are placed close to each other and speed matters, wired connections are preferred as they best serve the purpose. Wireless connections are used where wiring all the computers is impractical or the computers are placed far from each other.
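The basic file-sharing exchange described above can be sketched with two endpoints on one machine: one computer listens on a TCP socket, another connects and sends data, and the listener acknowledges it. The host, port and message are placeholders; real file sharing adds message framing, file paths, and error handling.

```python
# Minimal sketch of network communication: a listener (server) and a
# client exchange bytes over a TCP connection.
import socket
import threading

def server(sock):
    conn, _addr = sock.accept()          # wait for one client to connect
    with conn:
        data = conn.recv(1024)           # receive up to 1 KiB
        conn.sendall(b"got: " + data)    # send back an acknowledgement

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
listener.listen(1)
threading.Thread(target=server, args=(listener,), daemon=True).start()

with socket.create_connection(listener.getsockname()) as client:
    client.sendall(b"report.txt contents")
    reply = client.recv(1024)
print(reply.decode())  # got: report.txt contents
listener.close()
```

The same code runs unchanged whether the underlying link is wired or wireless; the physical medium only affects speed and reliability, as the section notes.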
The processes whereby computer equipment, data, information and services are protected from being intercepted, changed or destroyed by untrusted parties are all categorized under Computer Security.
Security can be implemented on various levels. At the software level, the primary layer on which security is critical is the Operating System. Here, security policies are enforced in the system kernel itself, which makes it far harder for unwanted elements to penetrate the machine.
Where the Operating System offers only weak security, another layer must be introduced, referred to as Secure Coding: programmers follow secure coding practices to make applications more resistant to viruses and malware. Languages like Java are comparatively resilient to such attacks by "intruders", while languages like C and C++ are at times not equally secure.
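One concrete secure-coding practice is worth a small illustration: parameterized queries, which defeat SQL injection attacks that naive string concatenation allows. The table and data below are invented for the example.

```python
# Sketch of a secure-coding practice: parameterized SQL queries.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "x' OR '1'='1"   # classic injection payload

# Unsafe: attacker-controlled input becomes part of the SQL itself,
# so the WHERE clause is true for every row.
unsafe = db.execute(
    "SELECT secret FROM users WHERE name = '" + malicious + "'"
).fetchall()

# Safe: the driver treats the input strictly as data, never as SQL.
safe = db.execute(
    "SELECT secret FROM users WHERE name = ?", (malicious,)
).fetchall()

print(unsafe)  # [('s3cret',)] -- injection leaked the row
print(safe)    # []            -- parameterized query matches nothing
```

The same principle, never mixing untrusted input into code, underlies many of the practices that make applications resistant to attack.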
Security against internet-borne viruses can be provided by firewalls, which scan the system for suspicious activity. An Intrusion-Detection System (IDS) may also be implemented to monitor network and system activity and automatically notify administrators when it detects a potential intrusion.
As preventative measures, the following techniques are used:
· Digital signatures
· Smart cards
· Encryption of data
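The digital-signature idea from the list above can be sketched with a keyed hash (HMAC) from Python's standard library. This is a simplification: a real digital signature uses asymmetric key pairs, whereas the shared key, message, and tag here are invented for the demo.

```python
# Sketch of message authentication, a building block behind the
# "digital signatures" preventative measure listed above.
import hashlib
import hmac

key = b"shared-secret-key"   # placeholder key for illustration only

def sign(message: bytes) -> bytes:
    """Produce an authentication tag for the message."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Check a tag; compare_digest avoids timing side channels."""
    return hmac.compare_digest(sign(message), tag)

msg = b"transfer $10 to Bob"
tag = sign(msg)

print(verify(msg, tag))                        # True  -- untampered
print(verify(b"transfer $9999 to Eve", tag))   # False -- altered message
```

Any change to the message invalidates the tag, which is what lets a recipient detect that data was altered in transit.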
Sometimes security is considered at the time of hardware design of computers too. Some approaches to this method are:
· Automated theorem proving
· Audit trails
· Principle of least privilege
· Full disclosure
Computer Security bills are constantly under debate in the United States and other developed countries because of the potential damage security failures can cause. In practice, the hazards of inadequate computer security vary to a large extent, ranging from leakage of confidential data, including credit card details, addresses and phone numbers, to risks to human life. Cybercrime is not a novel concept in today's world; incidents of bank thefts carried out from remote computers have taken place recently. In the aviation industry, for example, it is essential to be vigilant about any computer vulnerabilities which might lead to terrorist attacks, loss of expensive equipment or luggage, or might put human lives at stake.
Recent security breaches include:
· Japan Space Agency’s rocket information theft
· Aramco Oil Company’s data theft (Saudi Arabia)
Data Mining is a process used to handle large sets of data by finding patterns within them, helping make sense of large chunks of information for further use. It adopts methods from artificial intelligence, machine learning, statistics and database systems to interpret the data and convert it into an understandable structure.
Data mining extracts useful and relevant information from the data and transfers it into a database for further use; this step is called modeling. Once the information is recognized and classified into a usable form, it can be applied in different situations to help solve different problems. Data mining is used in almost every field, ranging from gaming to running a business. The overall process consists of the following steps:
(1) Selection
This is the process of selecting the data that is to be processed.
(2) Pre-processing
This process sets apart the target data by observing and finding patterns in the selected data.
(3) Transformation
In this process the data is transformed and converted into a form on which data mining can easily take place.
(4) Data Mining
This is the main process, and it consists of six different tasks.
· Anomaly detection – Identifying abnormal and unusual data records, such as errors or outliers
· Association rule learning – Searching for relationships between variables
· Clustering – Identifying similar groups of data and putting them together
· Classification – After similar groups of data are identified and organized, the data is labelled and classified into different categories
· Regression – Finding a function that models the data with the least error
· Summarization – Producing a compact representation of the data
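The anomaly-detection task from the list above can be sketched with a simple statistical rule: flag any value that lies far from the mean, here more than two standard deviations. The readings and the threshold are invented for illustration; real systems use far more robust methods.

```python
# Toy anomaly detection: flag values more than 2 standard
# deviations away from the mean of the data set.
from statistics import mean, stdev

readings = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 42.0, 10.1]

mu = mean(readings)
sigma = stdev(readings)

anomalies = [x for x in readings if abs(x - mu) / sigma > 2]
print(anomalies)  # [42.0]
```

The other tasks follow the same spirit at larger scale: clustering groups similar records, classification labels them, and regression fits a function to predict them.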
(5) Interpretation/Evaluation
This is the process of recognizing and identifying useful patterns in the data, as not all patterns found are useful. It is carried out by testing the data.