2010, Proceedings of the Decision Sciences Institute
What is the future of computing? It is hard to predict. Disruptive technologies frequently change a field's direction, remake its market, and derail any technology plan. So in examining the future of computing, we need to examine where we have been. This is not to say that we are returning to punch cards and 300 baud modems, but examining the obstacles of the past can sketch a roadmap of the future and suggest what emerging technologies may look like.
Progress in computer technology over the last four decades has been spectacular, driven by Moore's Law, which, though initially an observation, has become a self-fulfilling prophecy and a board-room planning tool. Although Gordon Moore expressed his vision of progress simply in terms of the number of transistors that could be manufactured economically on an integrated circuit, the means of achieving this progress was based principally on shrinking transistor dimensions, and with that came collateral gains in performance, power-efficiency and, last but not least, cost.
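To make that cadence concrete, here is a minimal sketch of the exponential scaling the abstract describes; the Intel 4004 baseline and the two-year doubling period are commonly cited figures assumed here purely for illustration, not numbers taken from the paper.

```python
# Illustrative Moore's-Law-style projection: transistor count doubling
# roughly every two years. Baseline (Intel 4004, ~2,300 transistors in
# 1971) is a commonly cited figure, assumed here for illustration.
base_year, base_count = 1971, 2300
doubling_period_years = 2.0

def projected_transistors(year: int) -> float:
    """Project transistor count assuming doubling every two years."""
    periods = (year - base_year) / doubling_period_years
    return base_count * 2 ** periods

for year in (1971, 1991, 2011):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
# 1971: ~2,300; 1991: ~2,355,200; 2011: ~2,411,724,800
```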
Proceedings of the International Astronomical Union
According to a Top500.org compilation, large computer systems have been doubling in sustained speed every 1.14 years for the last 17 years. If this rapid growth continues, we will have computers by 2020 that can execute an exaflop (10^18 floating-point operations) per second. Storage is also improving in cost and density at an exponential rate. Several innovations that will accompany this growth are reviewed here, including shrinkage of basic circuit components on silicon, three-dimensional integration, and Phase Change Memory. Further growth will require new technologies, most notably those surrounding the basic building block of computers, the Field Effect Transistor. Implications of these changes for the types of problems that can be solved are briefly discussed.
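As a quick sanity check on that projection, the sketch below extrapolates the 1.14-year doubling time quoted above from an assumed ~1.75 petaflop/s machine in 2009 (roughly the top system of that era; the baseline is our assumption, not the paper's).

```python
import math

# Extrapolate the Top500 trend cited in the abstract: sustained speed
# doubling every 1.14 years. The 2009 / 1.75 petaflop/s starting point
# is an assumed baseline for illustration.
doubling_time = 1.14                  # years per doubling (from the abstract)
base_year, base_flops = 2009, 1.75e15
target_flops = 1e18                   # one exaflop per second

years_needed = doubling_time * math.log2(target_flops / base_flops)
print(f"Exaflop reached around {base_year + years_needed:.0f}")  # ~2019-2020
```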
IEE Proceedings - Nanobiotechnology, 2004
All modern computers are designed using the 'von Neumann' architecture and built using silicon transistor technology. Both architecture and technology have been remarkably successful. Yet there are a range of problems for which this conventional architecture is not particularly well adapted, and new architectures are being proposed to solve these problems, in particular based on insight from nature. Transistor technology has enjoyed 50 years of continuing progress. However, the laws of physics dictate that within a relatively short time period this progress will come to an end. New technologies, based on molecular and biological sciences as well as quantum physics, are vying to replace silicon, or at least coexist with it and extend its capability. The paper describes these novel architectures and technologies, places them in the context of the kinds of problems they might help to solve, and predicts their possible manner and time of adoption. Finally it describes some key questions and research problems associated with their use.
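For readers unfamiliar with the term, the sketch below illustrates the stored-program 'von Neumann' model that the abstract contrasts with nature-inspired architectures: a single memory holds both instructions and data, and the processor repeats a fetch-decode-execute cycle. The three-instruction machine is invented purely for illustration.

```python
# Minimal von Neumann machine: one memory for both code and data,
# one fetch-decode-execute loop. The tiny instruction set (LOAD,
# ADD, STORE, HALT) is hypothetical.
def run(memory):
    pc, acc = 0, 0                        # program counter, accumulator
    while True:
        op, arg = memory[pc]              # fetch
        pc += 1
        if op == "LOAD":                  # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":                 # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":               # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return acc

# Instructions (cells 0-3) and data (cells 4-5) share the same memory.
program = [("LOAD", 4), ("ADD", 5), ("STORE", 4), ("HALT", None), 2, 3]
print(run(program))                       # 2 + 3 -> prints 5
```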
International Conference on eBusiness, eCommerce, eManagement, eLearning and eGovernance 2014, 2014
The presentation reviews, and raises questions about, the way in which personal computer and mobile communications technologies will be used in the near and medium-term future. The evolutionary paths of the relevant technologies are reviewed and it is argued that they are converging rapidly. Inasmuch as mobile communications have a clearer forward plan and a larger sales base, they appear set to become a dominant paradigm. However, the lack of vision regarding the way in which they might be used, and the slow rate of development of appropriate content, suggest that the industry needs to consider the wider implications and to think very radically and creatively about future applications. Work to develop new applications and the media content to display on phone-type screens will be reviewed.
This article aims to present how the computer, humanity's greatest invention, evolved and what its future will most likely be. The computer is humanity's greatest invention because the worldwide computer network made possible the use of the Internet, the technology that most changed the world with the advent of the information society. IBM developed the mainframe computer starting in 1952. In the 1970s, the dominance of mainframes began to be challenged by the emergence of microprocessors, whose innovations greatly facilitated the task of developing and manufacturing smaller computers, then called minicomputers. In 1976, the first microcomputers appeared, at costs that were only a fraction of those charged by manufacturers of mainframes and minicomputers. The existence of the computer provided the conditions for the advent of the Internet, undoubtedly one of the greatest inventions of the 20th century, whose development began in 1965. At the beginning of the 21st century, cloud computing emerged, symbolizing the tendency to place all infrastructure and information available digitally on the Internet. Current computers are electronic, built from transistors in electronic chips, and they face a fundamental limitation: there will come a time when it is no longer possible to reduce the size of the transistor any further. Quantum computers have emerged as the newest answer from physics and computing to the limited capacity of electronic computers; the Canadian company D-Wave claims to have produced the first commercial quantum computer. In addition to the quantum computer, Artificial Intelligence (AI) can reinvent computers.
In posing the question of challenges to computing, we consider what will sustain it. That is, we ask whether, or when, computing and computers will come to the end of their innovative applications. This is not a discussion about bigger and faster machines. Of course, bigger and faster computers can and will push ordinary, well-explored topics to new limits; this is ongoing and will continue for centuries. Rather, we enter into a discussion about the use of computers to solve new, even revolutionary, problems of this world. Innovation is necessary for the simple reason that problems are becoming bigger, more complex, even wicked, and some apparently impossible.
This survey studied the past and current computing arena to predict future trends in computing and to glimpse what the next era of computing might be. The survey instrument was developed and validated from the reported and observed influence of various technologies: Internet use, product manufacture, software evolution, the technology exodus, Data in Transit, Data at Rest, Data in Use, security technology, and performance evaluation and technology evolution. The instrument was mainly used to gather predictive evidence about what is next in computing, in addition to the advanced computing technology components starting to be found in elementary education and so forth.
Lecture Notes in Computer Science, 2010
The history of information technology is not the history of how wires got into boxes. Technological developments are intertwined in the social fabric, and their story includes the direct experience of individuals and the impacts felt by communities. Computers were once thought to be relevant only to specialists, but people today are more aware of the reach of computers into their lives. Similarly, the history of computing has traditionally been the focus of specialists in technology, but a greater variety of scholarly researchers is now studying archival collections about computing. The Social Issues in Computing Collection at the University of Minnesota's Charles Babbage Institute seeks to collect a wider array of perspectives on the industry and even to change the way people think about computing and archives.
Boast, R. (2002) Computing Futures: A Vision of the Past. In B. Cunliffe, W. Davies and C. Renfrew (eds.), Archaeology: the widening debate. London, British Academy, pp. 567-592. A comparison of Virtual Reality as used in archaeology with the equally archaeological reconstructive project of 19th century Victorian Spectacular Theatre.
arXiv (Cornell University), 2022
As it is reasonably certain that Moore's law will end some day, questions about the post-Moore era are more than interesting, and the search for new computing paradigms that could provide solutions is important. Revisiting the history of digital electronics since the 1960s provides significant insight into the conditions under which a new emerging technology succeeds in replacing the currently dominant one. Specifically, the past shows when constraints and "walls" have contributed to evolution through improved technical approaches, and when they have provoked changes of technology (evolution versus breakthrough). The main criterion for a new technology or a new computing paradigm is a significant performance improvement (at least one order of magnitude); cost, space requirements, power and scalability are the other important parameters.
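That adoption criterion lends itself to a simple decision rule; the sketch below encodes it under assumed thresholds. The one-order-of-magnitude bar comes from the abstract, while the secondary tolerances and field names are our assumptions.

```python
# Hedged sketch of the paper's adoption test for a successor technology.
# The >= 10x performance bar is from the abstract; the 2x tolerance on
# secondary parameters and the dictionary fields are assumptions.
def viable_successor(candidate: dict, incumbent: dict) -> bool:
    if candidate["performance"] < 10 * incumbent["performance"]:
        return False                        # needs an order-of-magnitude gain
    for key in ("cost", "space", "power"):  # secondary parameters
        if candidate[key] > 2 * incumbent[key]:
            return False                    # regresses too badly on a secondary axis
    return candidate.get("scalable", False)

cmos = {"performance": 1.0, "cost": 1.0, "space": 1.0, "power": 1.0}
new_tech = {"performance": 12.0, "cost": 1.5, "space": 0.8,
            "power": 1.2, "scalable": True}
print(viable_successor(new_tech, cmos))     # True
```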
Papers and discussions presented at the December 3-5, 1958, Eastern Joint Computer Conference: Modern computers: objectives, designs, applications, AIEE-ACM-IRE '58 (Eastern), 1958
Acknowledgement: I truly thank Almighty Allah for bestowing me with the courage and capacity to seek knowledge up to this level. Specific to this project, my special gratitude goes to my teacher, Sir Muhammad Jahangir, without whose scholarly input and patronage it would not have been possible for me to carry out this research, and to all my dear colleagues, for their encouragement, advice and suggestions on the work. I also stand indebted to my wife, who encouraged me and stood by me at every moment that I needed her support. Dedication: I dedicate this thesis to my mother, for inspiring me to endeavour to stand by and support this enormous project and valued task, and to my father, whose faith gave me the role model and the motivation to withstand the arduous journey through the various phases of the project.
arXiv:quant-ph/0503068, 2005
"The purpose of life is to obtain knowledge, use it to live with as much satisfaction as possible, and pass it on with improvements and modifications to the next generation." This may sound philosophical, and the interpretation of the words may be subjective, yet it is fairly clear that this is what all living organisms, from bacteria to human beings, do in their lifetime. Indeed, this can be adopted as the information-theoretic definition of life. Over billions of years, biological evolution has experimented with a wide range of physical systems for acquiring, processing and communicating information. We are now in a position to make the principles behind these systems mathematically precise, and then extend them as far as the laws of physics permit. Therein lies the future of computation, of ourselves, and of life.
Personal and Ubiquitous …, 1997
The paradigm for future personal computing is changing.
International Journal for Research in Applied Science and Engineering Technology (IJRASET), 2020
In this paper, we discuss the current state of computing (AI) and its future opportunities. Throughout this publication, the aim is to demystify terms such as AI, Deep Learning, Machine Learning, and Algorithms. To put it in a few words, AI is where machines can do what humans can do; well, almost so, since current systems don't have any sense of consciousness. As we progress through the study, we will examine the various levels of AI.
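To ground the distinction between hand-coded rules and machine learning that the abstract gestures at, here is a minimal perceptron that learns the logical AND function from examples rather than being programmed with it; all data and parameters are illustrative.

```python
# Minimal perceptron: learns logical AND from labelled examples.
# Weights, learning rate, and epoch count are illustrative choices.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                        # a few passes over the data
    for (x1, x2), target in examples:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = target - pred                # perceptron update rule
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

for (x1, x2), _ in examples:
    print((x1, x2), 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)
# (0, 0) 0 / (0, 1) 0 / (1, 0) 0 / (1, 1) 1
```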