Iagon and the evolution of computer technology

in #crypto · 6 years ago (edited)

INTRODUCTION

The study of bulk power systems was one of the first technical fields to seriously court and attempt to harness the power of scientific computing. If we look at how large computers such as Ferranti's Atlas and the IBM 650s and 7090s were being used around 1960, we note the predominance of calculation codes for the nuclear industry, petroleum research, aeronautics and, right up among the leaders, electrical power systems.

In the following 30 years or so, the power of the computers available has grown by a factor of almost 10,000. Other computing resources have kept pace with this development - memory capacity, data communication speeds, etc. - so that the tools we use today bear little or no resemblance to their forebears in the pioneering days of information technology. This evolution gives rise to two questions. First, how can the volume of work have increased 10,000-fold? Or, put another way, did we really do 10,000 times less research in 1960?

Furthermore, while this evolution may seem extremely rapid, it still shows little sign of slowing down. Hence our second question, running parallel to the first: how will power system simulation be affected by a future 10- or 100-fold gain in computing performance? Will using this computing power present us with new problems at the very limit of our knowledge, in fields as yet untouched by research? Or will we simply try to do what we do now, but more quickly and at less expense?
These questions arise in every field that makes use of scientific computing power, but more often than not the answers are quite simple. If we take, for example, mechanical engineering or materials research, we observe that this gain in computing power has so far been used to the full, and will continue to be in the future, as scientists:

• move from one-dimensional simulations to two- and then three-dimensional simulations and, for time-dependent problems, add one more dimension;
• reduce the scale of the simulation from the entire mechanical part to the crack or even the grain, and then on to the level of the atom.
Neither of these future paths of development, however, can really be applied to power systems. To be more precise, we note that the increase in the size of simulated systems (an increased number of nodes) has not, for some time now, been at all limited by scientific computing capacity. On the other hand, simulation in time introduces a number of new scales, and we can imagine that, as computing techniques evolve, we will be able to solve problems with ever shorter time scales: from 'long-term dynamics' to 'electromagnetic transients'.

However, things have not developed exactly in that way, nor will they continue along these lines. Early simulation of networks and their associated generation systems quickly gave rise to three types of modelling that today co-exist and are being developed in parallel:

• an algebraic representation (load-flow calculations and unit commitment optimization) used to solve steady-state or only slightly variable problems;
• an algebraic-differential representation (long-term dynamics, electromechanical oscillation and transient stability);
• a fully differential representation (fast electromagnetic transients).
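The first, purely algebraic representation above can be illustrated with a minimal DC load-flow sketch. This is a generic textbook formulation, not code from any particular simulator, and the network data (line susceptances, power injections) are invented for illustration:

```python
# Minimal DC load-flow sketch on a hypothetical 3-bus network.
# All numbers are assumed, illustrative per-unit values.
import numpy as np

# Line susceptances for lines 1-2, 1-3 and 2-3 (per unit, assumed)
b12, b13, b23 = 10.0, 5.0, 8.0

# Reduced susceptance matrix for buses 2 and 3
# (bus 1 is the slack bus, with its voltage angle fixed at 0)
B = np.array([
    [b12 + b23, -b23],
    [-b23,      b13 + b23],
])

# Net power injections at buses 2 and 3 (per unit); negative = load
P = np.array([-0.5, -0.8])

# The "algebraic representation": solve the linear system P = B @ theta
theta = np.linalg.solve(B, P)

# Recover line flows out of the slack bus from the angle differences
flow_12 = b12 * (0.0 - theta[0])
flow_13 = 5.0 * (0.0 - theta[1])
```

Because the model is linear and static, the whole steady-state problem reduces to one linear solve; the slack bus automatically supplies the total load (here 1.3 pu), which is why no differential equations are needed for this class of study.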

Most obviously, computational power and computing architectures shape the speed of training and inference in machine learning, and therefore influence the rate of progress in the technology. But these relationships are more nuanced than that: hardware shapes the methods used by researchers and engineers in the design and development of machine learning models. Characteristics such as the power consumption of chips also define where and how machine learning can be used in the real world.

Despite this, many analyses of the social impact of the current wave of progress in AI have not substantively brought the dimension of hardware into their accounts. While a common trope in both the popular press and scholarly literature is to highlight the massive increase in computational power that has enabled the recent breakthroughs in machine learning, the analysis frequently goes no further than this observation about magnitude. This paper aims to dig more deeply into the relationship between computational power and the development of machine learning. Specifically, it examines how changes in computing architectures, machine learning methodologies, and supply chains might influence the future of AI. In doing so, it seeks to trace a set of specific relationships between this underlying hardware layer and the broader social impacts and risks around AI.

What is IAGON?


IAGON is a platform for harnessing the storage capacities and processing power of multiple computers over a decentralized Blockchain grid.

With such a platform, revenue can be generated by sharing a computer's resources, including processing power and storage, with any individual or organization. Anyone can create their own smart contracts through an easy-to-use interface without writing a single line of code, and [Iagon](https://www.iagon.com) grants the individual tokens that can be converted to fiat money.

Let's take a scenario: a low-profit organization such as a public library has unused computing power and storage it can offer at a reduced cost, while researchers at a high-profit organization need computing resources to cross-reference data about their activity. Iagon stands as an intermediary connecting the public library and the researchers, granting the library tokens which can later be converted to fiat money. Computer systems are now widely used for e-transactions, but much of the socially, economically and technologically disadvantaged population is still struggling to access them. Sweden, for instance, is closer than any other country to going completely cashless.

Computer programs help to perform calculations and maintain balance sheets in large numbers within seconds, and they make it possible to keep track of transferred funds, which can also be monitored for security purposes such as terror financing and black money.


For businesses that require external cloud computing services, Iagon can serve as a lifesaver. Through Iagon, infrastructure and human capital needs can be greatly reduced.

The need for on-site computing power or costly cloud services is greatly reduced, as is the number of employees required. Instead, individuals sharing their computing power and storage set the cost, rental period, and computer architecture on offer, and users pick whatever they require for their project.
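The marketplace described above can be sketched in code. This is a purely hypothetical model: the class names, fields, and token-denominated pricing are assumptions for illustration, not Iagon's actual API or data model.

```python
# Hypothetical sketch of a decentralized resource marketplace.
# None of these names come from Iagon; all values are illustrative.
from dataclasses import dataclass

@dataclass
class ResourceOffer:
    provider: str        # e.g. a public library sharing idle machines
    cpu_cores: int
    storage_gb: int
    price_per_hour: float  # priced in tokens (assumed unit)

    def cost(self, hours: float) -> float:
        """Total token cost of renting this offer for `hours` hours."""
        return self.price_per_hour * hours

# Offers published by providers sharing their spare capacity
offers = [
    ResourceOffer("library-1", cpu_cores=8, storage_gb=500, price_per_hour=0.2),
    ResourceOffer("desktop-7", cpu_cores=4, storage_gb=250, price_per_hour=0.1),
]

def cheapest(offers, min_cores, min_storage_gb):
    """Pick the cheapest offer meeting the user's requirements, or None."""
    eligible = [o for o in offers
                if o.cpu_cores >= min_cores and o.storage_gb >= min_storage_gb]
    return min(eligible, key=lambda o: o.price_per_hour, default=None)

# A researcher needing 8 cores and 400 GB selects an offer
best = cheapest(offers, min_cores=8, min_storage_gb=400)
```

The point of the sketch is the matchmaking: providers state what they offer and at what price, consumers state what they require, and the platform simply mediates between the two, crediting the provider in tokens.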

Here's my video entry:

For more information watch this short introduction:


