A Forecast of the Future of Mankind in the 21st Century

Andrei Kapatsy

Civilization of Gods
 

A forecast of the development of science and technology
in the 21st century


 
 

    

 
  INTRODUCTION
[1] The first decade
[2] The second decade
[3] The third decade
[4] The fourth decade
[5] The fifth decade
[6] The sixth decade
[7] The seventh decade
[8] The eighth decade
[9] The ninth decade
[10] The tenth decade
  Conclusion


   

The first decade (2000 - 2010)

Decoding of the human genome. Creation of a database of human genetic material. Rapid development of human comparative genetics. Comparison of individual genes and gene groups with the traits they encode. Identification of the principal gene groups responsible for the synthesis of the most important human proteins. Creation of a new class of medicines aimed at gene normalization. Critical analysis of human genetic material. Theoretical basis of human genome optimization. A computer model of the "sample" human genome. Development of genetic weapons. Launch of the global program "The Human Protein". The first computer models of sample genomes of agricultural plants. Influence of "computer selection" technologies on the prospects of farming worldwide. Breeding of sample agricultural plants and animals. Flourishing of combinatorial chemistry and its application in the pharmaceutical industry. Renewal of the assortment of medicines. Theoretical basis for designing molecules with tailor-made properties. Development and use of special-purpose catalysts. The need to model chemical interactions at the quantum level. Application of nanotubes in the creation of electronic devices. Miniaturization and growing power of personal and professional computers. The first three-dimensional microcircuits produced by molecular assembly. Development of an industry of molecular robots. Growth of investment in science and science-based industries. The first systems of volumetric computer visualization. Prospects of computer modeling. Launch of the project "Common Space of Virtual Modeling". The beginning of the introduction of robotics into human life. The first samples of portable electronic translators. Creation and use of the first computer judicial programs.

After the long-awaited triumph at the turn of the century, the successful decoding of the human genome, a golden age began for genetics and a number of related sciences. It showed first in increased financing of theoretical and applied research by private capital, which is the key player where new directions in science are concerned, timing the start of financing unerringly. The success of the genome project shifted the priorities of scientific and technological development toward biotechnology and the design of organisms with specified properties. Sensing the importance of the moment, when within months or years the foundations of future financial empires could be laid, private capital began a mass movement into the promising scientific niche. Tens of thousands of companies of different sizes, fields and financial levels began to invest in research that promised excess profits in the near future. Hundreds of thousands of scientists, managers, financial experts, politicians and plain businessmen were drawn into the new activity.

In a relatively short time, other independent branches of science and industry responded to the rapid development of genetics. Powerful computers, ultra-pure substances, specialized software and precision labware soon appeared on the world market. Official scientific institutions in most countries of the world did not stand aside either. The biggest states, anxious to keep control over the development of strategic branches of science, technology and industry, did not skimp on financing promising projects. In many countries, official and secret programs of scientific research were adopted, matching the priorities of state policy and economic potential. A new stage began in the technological separation of countries with a high level of scientific development from countries with weaker scientific and industrial potential. New technologies gave rise to new technologies.

Though in most cases each research group worked under conditions of state or commercial secrecy, one can speak of a collective, universal assault on the secrets of human development locked in genetic information. No problem could withstand such a mass attack, and it began to give in, yielding one secret after another.

First came the results of decoding the genomes of dozens of different people. The subjects chosen for research were people with outstanding, rare or abnormal morphological features, since understanding how anomalies and individual differences are realized is of high practical value for the study of the human genome.

Great attention was paid to the study and analysis of the hereditary material of representatives of small ethnic groups, of populations living locally or in isolation, and of individuals with extraordinary inborn abilities and features. In the further analysis of the collected information, attention focused on characteristics such as age, choice of profession and the presence of hereditary diseases. With each passing year, ever more refined technology for decoding human genetic information made genome research steadily cheaper and widened the range of new possibilities and their practical applications. At the same time, under pressure from public institutions and from individual scientists and politicians, a public database of human hereditary material was created on the servers of the world's leading scientific centers.

In the middle of the decade, comparative human genetics smoothly separated from genetics and became a discipline in its own right. Comparative analysis and statistical processing of the accumulated data made it possible to draw important conclusions by matching human genes against the traits they encode. Mathematical methods of processing also allowed individual genes and groups of genes to be matched with the traits they determine with pinpoint accuracy. Scientists from different countries made a great many discoveries using the public data, purely on the basis of correlations and direct comparison, even without understanding the mechanisms by which gene-coded traits are realized.
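The kind of direct comparison described here can be illustrated with a minimal sketch: group the carriers of each gene variant and compare the frequency of a trait between the groups. The variant and trait names below are invented for illustration, not real loci.

```python
# Hypothetical toy data: each record pairs a gene variant carried by one
# individual with a trait observed in that individual.
records = [
    ("variant_A", "trait_present"),
    ("variant_A", "trait_present"),
    ("variant_A", "trait_absent"),
    ("variant_B", "trait_absent"),
    ("variant_B", "trait_absent"),
    ("variant_B", "trait_present"),
]

def trait_frequency(records, variant, trait):
    """Fraction of carriers of `variant` that show `trait`."""
    carriers = [t for v, t in records if v == variant]
    return carriers.count(trait) / len(carriers)

freq_a = trait_frequency(records, "variant_A", "trait_present")
freq_b = trait_frequency(records, "variant_B", "trait_present")
# A large gap between the two frequencies hints at an association worth
# investigating, even with no mechanistic explanation in hand.
print(freq_a, freq_b)
```

Real studies of this kind would of course demand large samples and proper statistical tests, but the core operation is exactly this comparison of frequencies.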

First, the genes and groups of genes responsible for the synthesis of various human proteins were largely identified. Second, the groups of genes carrying information about the timing of synthesis and the mutual interactions of proteins in the course of organic evolution were largely identified. Third, groups of genes with unclear functions were identified.

The practical application of the collected data took the form of definite recommendations for most people concerning their professional activity, choice of place to live and way of life. The information obtained from an extended analysis of one's own genome allowed a person to live longer, more safely and more fully, imposing strict restrictions on some life priorities and encouraging others that were useful and favorable. In other words, a new consultant on healthy living appeared: one's own individual genome.

Some discoveries spread quickly and found application. Thus, the creation of public databases of the genetic texts responsible for the presence or absence of hereditary diseases made it possible to define hundreds of new targets in the human body for therapeutic intervention at the level of genes. It turned out that a number of hereditary diseases could be treated by normalizing the work of an individual gene or group of genes in the functional cells of the body. In theory, the general aim of normalizing the functions of defective genes directly in the human body could be achieved in two principal ways.

The first is restoration of the defective gene's function: remodeling it, releasing it, adding a missing part, replacing the whole active gene, and so on.

The second way of treating hereditary diseases is inhibition, the suppression of the unhealthy function of a defective gene. It could be realized through destruction of the gene, its remodeling, the addition of a neutralizing fragment, deactivation by other means, and so on.

The flowering of combinatorial chemistry, which produced a pharmaceutical boom, together with genetic research led to the creation of more than a hundred new medicines that normalize the work of defective genes and effectively treat many hereditary diseases. In parallel, new means were worked out for delivering a medicine directly to the cell, to a definite place in the genome and even to a definite gene. Practical experience with the first such medicines laid the foundation of a new class of drugs aimed at gene normalization. In the near future such "gene normalizers" were planned for use not only in correcting inborn defects but also in correcting acquired ones, for the functional recovery of the tissues and organs of the human body.

The procedure of genome decoding, which had become routine by the end of the decade, and the practical experience of decoding hundreds of genomes made it possible to undertake a critical analysis of human genetic material.

Despite the billions of individual genetic sequences that remained undecoded, it proved possible to isolate the most successful finds of evolutionary practice, those that gave a person good health, good abilities and active longevity. Particular attention was given to the correlation of genes with inherited external features, including the skeletal and muscular systems, a harmonious appearance, and the optimal functioning of organs and tissues.

The intensively accumulating information inevitably put on the agenda the question of a still purely theoretical optimization of the human genome. Reliable information about the genes with the best properties demanded systematization and refinement, and the best way to carry out such a systematization was within an already decoded human genome. The computer models of the human genome created in the preceding years were suited to the purpose.

One of those computer models was chosen for carrying out a computer optimization of the human genome. Day by day and hour by hour the model was updated, "bad" genes being replaced by "good" ones that coded favorable traits in their best quantitative implementation.

In this way, by the end of the decade a computer model of the "sample" human genome had been formed. The greatness of that event can be compared only with the discovery of the structure of DNA sixty years before. But unlike then, the scientific and technical potential of civilization engaged in genetic problems had grown far more powerful and could be concentrated quickly on the task at hand. The combined power of human knowledge had become a huge force defining the further evolution of the biosphere. The creation of a computer model of the "sample" human genome removed the question "What is to be done?" for genetics as a whole. From that time on, the prospects of genetics as a science were mapped out for decades ahead, and there was no science in the world closer to the human being.

The reverse side of refining the "sample" genome model was the creation of a computer model of the "declining" genome, a set of "bad" human genes that would still allow the human body to develop and live, but on the edge of death. That model was of great value for studying the survival conditions of the human being as a species and yielded a number of useful, often unexpected discoveries.

A negative result of the new information about the human genome was its active application for military purposes. The availability of information about the use of gene "normalizers" in medicine, together with the publicity of genetic research, led to the creation in secret laboratories of a new generation of genetic weapons able to act on representatives of particular races, nations, professions and social groups, and even on a definite person. No weapon on Earth was more dangerous. A real and direct danger arose of the free spread of deadly genetic material across the whole planet. Scientists were the first to sound the alarm. Broad protests against genetic weapons, worldwide in character, drew the attention of the world political elite to the problem and encouraged the timely taking of preventive measures. Public protest made politicians and the military sign prohibitive conventions and agreements and work out measures of strict multilateral control. Earth's civilization, for its part, acquired a new risk factor: the possibility of the sudden use of genetic weapons. The only positive moment, if it can be called that, was that the military laboratories designing genetic weapons were also working out means of defense against them, including protective means at the level of genes, which on declassification proved very interesting to official science.

The interpretation of the genetic texts of humans and of other organisms of the Earth's biosphere satisfied human curiosity to some extent. At any rate, it became clear in what words and in what language the information about the structure and development of the human body is written in human hereditary material. It could be said with certainty that many of those words described definite traits and evolutionary stages of the human organism. No more than that. Questions such as how the information recorded in the genome is transformed into a living organism that metabolizes, consumes energy and reproduces itself remained, for the most part, unanswered.

But only full and conclusive answers to these fundamental questions would allow applied genetics to take up the task of the practical improvement of the human body.

Realizing the necessity of advancing the scientific disciplines that study the human being, state institutions and private companies of the most developed countries began autonomous research programs, which soon merged into a planet-wide global program called "The Human Protein".

The fulfillment of this program, planned for a term of thirty years, was expected to lead to the following results:

  • determination of the spatial structure of all proteins synthesized in the human body;

  • study of the functions of all proteins synthesized in the human body, and of the foreign proteins used in metabolic reactions;

  • study of the mechanisms by which the human body is formed;

  • elucidation of the cause-and-effect relations gene - protein and protein - gene;

  • systematization of all biochemical reactions realized in the human body;

  • elucidation of the mechanisms by which chemical substances bind to the proteins synthesized in the human body.

The final result of the program's fulfillment was envisioned as a three-dimensional interactive computer model of the human body incorporating all the knowledge obtained.

The considerable expenditures of labor and money expected in carrying out "The Human Protein" did not frighten the world's scientific and financial elite. The practical experience gained in decoding the human genome had changed the perception of the scope of humanity's possibilities and made routine the work with astronomical numbers of objects and operations. Billions of nucleotides and pixels, tens of billions of molecules: dealing with such quantities had become customary in the work of geneticists, programmers, industrial engineers, computer specialists and other experts. Humanity got down to solving the next block of tasks.

Against the general background of global interest in human genetics, the achievements of genetics in other spheres went almost unnoticed, but they were no less effective. For instance, the technologies used in decoding the human genome were successfully applied to decoding the genetic texts of agricultural plants, animals, fungi and microorganisms. The methods of comparative genetics allowed computer models of the sample genomes of many plants and animals to be created in a short time. This concerned plants first of all. Industrial methods of genome decoding made it possible to obtain quickly the genetic texts of dozens of varieties of the same plant species and, by comparison, to identify the genome regions responsible for outstanding traits. By the end of the decade a boom of "computer selection" was under way. Since the costs of these technologies were comparatively low and the results promised to be quick and effective, representatives of medium-sized business began investing in equipment and in the new industry. New forces and additional capital gave a fresh impulse to the development of "computer selection" technologies. High-yielding, super-stable varieties of agricultural plants for different climatic conditions on Earth became a reality.

Such activity on the part of the middle class seriously frightened the conservative farmers of the developed countries. Fear of being left without work, of losing planned income and, frankly speaking, fear for their families' future led to mass protests by farmers, the creation of aggressive public organizations and the lobbying of new draft laws. The far-reaching consequences foreseen from the mass application of "computer selection" technology were a change of priorities in the choice of agricultural plants, a rise in unemployment among farmers and the consolidation of farms.

The newest technologies were also widely used to create improved strains and cultures of fungi and microorganisms for the processing and food industries. These developments, though they demanded great care and detailed testing, led to a considerable improvement in food quality, which had always been the primary task among the priorities of the food industry's development.

As for the practical assembly of the "sample genome" of the most important agricultural animals and the subsequent breeding of sample animals on its basis, considerable difficulties arose. The academic study of genetic texts and the creation of a computer "sample genome", the things to which public opinion was tolerant, could not give practical results quickly, because the theoretical conclusions could not be checked in practice. Even given a favorable public attitude to the breeding of sample animals, results could not come quickly because of the slow process of raising the animals. In real life, however, public opinion, tolerant of experiments with plants, was negative toward genetic experiments with animals: the public considered such experiments analogous to experiments on people.

The first decade of the new century was a period of flourishing for combinatorial chemistry, the branch of science that brought to perfection the extensive trial-and-error method of creating useful chemical compounds. The traditional search for chemical compound structures acquired an industrial scope. Synthesis of chemical compounds to order became one of the priority lines of business investing in science and one of the main activities of chemistry-oriented scientific institutions. The methods of combinatorial chemistry increased the productivity of chemists engaged in synthesizing new compounds by hundreds and even thousands of times. By traditional methods a typical specialist needed about a week to find, in first approximation, a substance with the required properties; the technologies of combinatorial chemistry allowed roughly a thousand candidate substances to be synthesized in the same period.
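The thousandfold productivity gain rests on enumeration: combining small libraries of building blocks multiplies the number of candidate compounds, while the chemist's effort grows only with the number of blocks. A minimal sketch, with invented fragment names standing in for real chemical building blocks:

```python
from itertools import product

# Hypothetical building-block libraries (the fragment names are invented).
cores = ["core1", "core2", "core3"]
linkers = ["linkL", "linkM"]
side_groups = ["grpX", "grpY", "grpZ", "grpW"]

def enumerate_candidates(cores, linkers, side_groups):
    """Yield every core-linker-side-group combination as a candidate compound."""
    for core, linker, group in product(cores, linkers, side_groups):
        yield f"{core}-{linker}-{group}"

candidates = list(enumerate_candidates(cores, linkers, side_groups))
# 3 * 2 * 4 = 24 candidates from only 9 building blocks: the library size
# grows multiplicatively while the stock of blocks grows additively.
print(len(candidates))
```

With libraries of ten blocks at each of three positions, the same loop already yields a thousand candidates, which is the scale the text describes.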

The new task that arose, how to choose the best and most effective candidate substances from those synthesized, was solved on a large scale on the basis of industrial approaches. Not each substance separately, as before, but hundreds and thousands of candidate substances simultaneously were checked for biological activity, the pharmaceutical industry being the main buyer and customer of the products of combinatorial synthesis. Such industrial technologies increased a thousandfold the number of substances studied in pharmacology and, as a result, reduced the cost of the search for new medicines and increased the effectiveness of those newly obtained. Still, the selected candidate substances underwent long and detailed testing before receiving clearance. In other words, the industrial approach did not work at the final stage, when the synthesized compounds were tested for biological activity: full-scale testing on complex biological objects was still required. Studying a new product's activity in a living organism, and the long-term consequences of taking it, demanded considerable expense as before.

The technologies of combinatorial chemistry laid the foundation of a new science dealing with the design of chemical compounds with specified properties, or simply chemical design. An honorable and loyal attitude on the part of society was guaranteed to that science from the start, if only because the human being itself is a product of chemical design in the course of natural evolution. Notwithstanding its disadvantages, combinatorial chemistry made a considerable contribution to the evolution of chemistry as a universal science. Computer modeling of the interaction of chemical compounds became a popular field of investigation: such knowledge was fundamental and was required as the theoretical basis for designing compounds with specified properties. The full-scale tasks facing scientists exposed the insufficiency of existing knowledge about the interaction of chemical compounds; reliable chemical products could not be obtained without a definite and clear understanding of all the nuances of that interaction.

The creation of reliable materials with specified properties was expected to raise labor productivity radically across the whole planet, to solve many ecological problems and to create optimal conditions and possibilities for people's lives.

Good prospects brought a flow of money into theoretical research on the interaction of chemical compounds and into related branches of science and industry. A number of state and private enterprises and scientific institutions in different countries concentrated their efforts on detailing the theory of chemical interactions and on developing the corresponding software.

While world science and progressive business were exploring the new scientific and economic niches, the already working technologies of combinatorial chemistry renewed the world's assortment of medicines by more than two thirds over the first decade of the new century. The new medicines were more effective, safer and cheaper than their predecessors, often several times over.

Besides pharmacology, the results of combinatorial chemistry were actively applied in developing means of plant protection, creating veterinary products and utilizing chemical substances. The general trends in the improvement of chemical technology demanded a considerable reduction of energy costs per unit of product, higher purity and quality of the synthesized compounds, waste-free production and environmental safety. In practice these trends were expressed in the need to lower the temperatures and pressures at which chemical reactions were carried out and to raise the utilization coefficient of materials and reagents. Fulfilling all those requirements looked possible only with advanced catalysts. The understanding of this fact by high-technology business brought money into academic and applied research on the mechanisms of catalytic activity.

The need to create highly developed catalysts demanded software for computer modeling of the interaction of two or more chemical compounds. The difficulty was that the modeling of promising catalytic reactions on powerful computers, the only approach that allowed perfect catalysts to be designed, was held back by several objective causes. The main one was the insufficient power of existing supercomputers, which did not permit the required calculations: the solution of the equations of quantum collision theory taking particle redistribution into account. There were also other reasons, explained by the incompleteness of theory in a number of related disciplines and by the absence of the necessary production technologies. In short, further progress required a complex approach to the computer modeling of chemical reactions, embracing the refinement of theory, the creation of effective software and the ability to concentrate computing power on the problem. For the time being science had no possibility of modeling chemical reactions exactly; in practice, methods of computer modeling with one or another degree of approximation to reality were applied.

Questions of modeling the interaction of chemical compounds were key to the development of a number of technologies, nanotechnology among them. The first results of the joint work of many scientific groups appeared toward the middle of the decade. The first three-dimensional electronic circuits, with active elements larger than single molecules, were produced and tested in practice. They were made of the same chemical elements and compounds that had been used in electronics before, that is, semiconductors and materials with high electrical conductivity. What was new was the real reduction in the size of the active elements. On the transition to nanoscales, the quantum properties of matter and quantum effects became the determining factors: the electron was described by a wave function, and the propagation of the working signal in a substance was determined by tunneling through potential barriers and by quantum confinement. Three-dimensional electronic circuits reached operating frequencies of up to a trillion hertz.

At the same time, experiments in designing and producing electronic devices based on carbon nanotubes were carried out in many laboratories of the world.

On close examination, carbon nanotubes proved a promising material for the electronics industry. The dependence of a nanotube's electrical properties on its geometrical parameters made it possible to obtain active elements with metallic or semiconductor properties, and even to alternate regions of metallic and semiconductor conductivity within a single active element. No less important was the possibility of using nanotubes as interconnects in three-dimensional electronic circuits.

Further computer evolution followed the path traditional for the previous fifty years. Processors became more powerful and microcircuits smaller; power consumption fell while speed rose. These tendencies concerned not only personal computers but supercomputers as well. The goal at which computer developers aimed, power equal to that of the human brain, seemed very near. Photolithographic technologies for producing refined integrated circuits made it possible to build single computers performing ten to the thirteenth operations per second (10 teraflops). Since the human brain is capable of roughly ten to the sixteenth or seventeenth operations per second, the desired goal seemed almost reached. However, considering that such supercomputers were not a single processor, with which the human brain could properly be compared, but hundreds and thousands of separate processors working in parallel, a single processor still lagged behind the human brain by a factor of ten million.
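The ten-million figure follows directly from the order-of-magnitude numbers quoted above. A quick check, taking the upper brain estimate and assuming roughly a thousand processors per machine, as the text suggests:

```python
# Order-of-magnitude figures taken from the text.
supercomputer_ops = 1e13      # ~10 teraflops for a whole photolithography-era machine
processors_per_machine = 1e3  # such machines aggregated roughly a thousand processors
brain_ops = 1e17              # upper estimate for the human brain, ops per second

single_processor_ops = supercomputer_ops / processors_per_machine  # 1e10
gap = brain_ops / single_processor_ops                             # 1e7

# A single processor trails the brain by about ten million times, even
# though the aggregate machine is "only" 10^4 times slower than the brain.
print(f"{gap:.0e}")
```

With the lower brain estimate of 10^16 operations per second the same arithmetic gives a gap of about one million, so "ten million" corresponds to the upper bound.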

At such levels of computer power the physical limit of photolithography was reached, restricting the further miniaturization of planar microcircuits. To approach the structural solutions realized in the human brain, it was necessary to reproduce in electronics the three-dimensional (neuronal) circuits peculiar to the brain, and the required designs could not be realized in hardware by the old technology. At the existing level of knowledge only one way remained for computer development: the creation of active elements and microcircuits by means of nanotechnology.

Technological methods already mastered under laboratory conditions allowed active elements of molecular size to be created and arranged into multi-level spatial circuits. In practice this meant single samples of simple microcircuits, unfit for use in real computers and serving mainly to demonstrate the potential of nanotechnology. Though the technologies of molecular assembly worked well in the laboratory, their industrial application required overcoming difficulties of a quantitative and temporal character: a production and instrument base had to be created and whole supporting branches developed, and all of this took time. To obtain a competitive microcircuit it was necessary to arrange, in strict order, tens of millions of molecules of certain chemical compounds, and the incorrect placement of only a few molecules could seriously degrade the product. Clearly, no specialist, however diligent, armed with a microscope and a device for moving single molecules, could assemble even one spatial microcircuit with the necessary accuracy. To complete work on such a scale, mechanisms were required that could manipulate separate molecules, move them carefully and install them in definite places. Given the minute size of the working field and of the objects of molecular assembly, the assembling mechanisms themselves had to have dimensions comparable to those of molecules.
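The sensitivity to a few misplaced molecules can be quantified with a simple yield estimate, assuming each of the N placements fails independently with some probability p. The error rates below are purely illustrative assumptions, not figures from the text.

```python
# Yield of a defect-free microcircuit when every one of n_molecules must be
# placed correctly and placements fail independently with error_rate each.
def assembly_yield(n_molecules, error_rate):
    return (1.0 - error_rate) ** n_molecules

n = 10_000_000  # tens of millions of molecules, as in the text

# One error per billion placements: about 99% of circuits come out defect-free.
print(assembly_yield(n, 1e-9))

# One error per million placements: almost every circuit is defective.
print(assembly_yield(n, 1e-6))
```

The second case shows why human-scale precision is hopeless here: even a placement process that errs once in a million attempts ruins nearly every circuit, which is the argument for molecular-scale assemblers.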

Molecular robots, the necessary instrument of the new assembly technologies, had yet to be designed and built. Time was required to prepare the complicated production base for turning out molecular robots and for their subsequent industrial use. The production of spatial microcircuits demanded the fitting together of a great number of existing high technologies and the creation of new ones; that was the main difficulty. On the other hand, the mass assault on one of the key problems of the time produced plenty of positive side effects.

The tight deadlines set for solving the numerous accompanying problems generated work for many small and specialized firms. Promising research began to attract serious investors, which raised the rate of discovery and led to intensive scientific activity in the financed branches.

A considerable development impulse reached world industrial production in general. Production of the purest chemical elements and compounds, creation of systems of automatic production control and quality control, design and production of new kinds of industrial equipment: this is far from a complete list of the branches involved in reconstruction and re-equipment at the stage of transition to the molecular assembly of spatial microcircuits.

It is worth noting that in functional purpose the molecular robot-assemblers were close to their natural analogues, the catalysts, widely represented both in the chemistry of organic synthesis and in living nature. Natural and artificial catalysts could themselves be called molecular robot-assemblers, since they take part in the molecular construction of complex organic molecules. The molecular robot-assemblers designed for building spatial microcircuits could, in turn, be defined as specialized catalysts for producing specific molecules; in other words, as specialized catalysts for producing the processor substance. This similarity reflected the real unity of the two sciences and the universality of the laws underlying the structure of the world. The example illustrates the general tendency toward merging such different disciplines as chemistry, cybernetics, biology and the physics of the microworld into a single science, a tendency that became ever more noticeable with the transition to technologies operating at the molecular and atomic levels of matter.

The massed assault on the problem of spatial microcircuits bore fruit. By the end of the decade the first volumetric processors appeared, with spatial microcircuits operating at frequencies of billions of hertz. Moreover, these microcircuits, produced by the technologies of molecular assembly, were quite fit for mass use in computers.

At the start of the new decade one more notable event occurred, one that concerned the future of computers and of humanity in general. The development of computer technologies, together with research in the electronics industry, allowed holography to step up to a qualitatively new level and take its place beside the newest technologies, such as nanotechnology and genetic engineering.

As has often happened in the history of science, computer systems of spatial visualization based on the principles of holography were worked out almost simultaneously in several countries, mainly in military laboratories. It is worth recalling that a holographic image is recorded through the joining (interference) of two laser beams. One of the beams is reflected from the object being copied; the second, reference beam, superimposed on the reflected one, produces an interference pattern that is fixed on photographic film. When the finished record is illuminated with a laser beam, it yields a volumetric image of the object, which appears to lie behind the surface of the film.
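
The interference of the two beams described above follows the standard two-beam formula: the recorded intensity depends on the phase difference between reference and object beams. A minimal scalar-wave sketch:

```python
import math

# Simplified sketch (scalar-wave model): the film records the combined
# intensity of the reference beam and the beam reflected from the object.
def fringe_intensity(i_ref: float, i_obj: float, phase_diff: float) -> float:
    """Interference intensity of two coherent beams with intensities
    i_ref and i_obj and a phase difference (in radians) between them."""
    return i_ref + i_obj + 2 * math.sqrt(i_ref * i_obj) * math.cos(phase_diff)

# In-phase beams add up to a bright fringe; a half-wave offset cancels them.
print(fringe_intensity(1.0, 1.0, 0.0))       # bright fringe
print(fringe_intensity(1.0, 1.0, math.pi))   # dark fringe
```

The pattern of bright and dark fringes across the film encodes the phase of the light reflected from the object, which is what allows the volumetric image to be reconstructed later.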

A volumetric image hanging in the air, that indispensable element of science-fiction films, became possible only with the appearance of technologies for controlling each individual image element. A single image element hanging in the air, that is, a point of light, was created by the joining of two laser beams, one serving as the main (reference) beam and the other reflected from a liquid-crystal element capable of introducing optical delays. Such a scheme made it possible to form a single image element in the air and, moreover, to move it through space. An array of liquid-crystal elements, each able to change its refractive index under an electromagnetic field, could shape equiphase surfaces, wave fronts that appeared to the eye as the surfaces of the displayed object.
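
The principle behind shaping such an equiphase surface is that of a phased array: each liquid-crystal cell adds just enough optical delay that light from every cell arrives at the target point in phase. The sketch below (geometry and wavelength are assumed values, not from the text) computes those per-cell delays:

```python
import math

# Hypothetical sketch: each liquid-crystal cell adds an optical delay so
# that light from every cell arrives at a chosen target point in phase,
# forming a bright spot that appears to hang in the air.
WAVELENGTH = 0.5e-6  # 500 nm, an assumed laser wavelength

def phase_delays(cell_xs, target_x, target_z):
    """Phase (radians, in [0, 2*pi)) each cell must add so that all
    optical paths to the target differ by whole numbers of wavelengths."""
    delays = []
    for x in cell_xs:
        path = math.hypot(target_x - x, target_z)  # cell-to-target distance
        delays.append((-2 * math.pi * path / WAVELENGTH) % (2 * math.pi))
    return delays

# A small symmetric row of cells focusing on a point 1 m away:
print(phase_delays([-0.01, 0.0, 0.01], 0.0, 1.0))
```

Cells placed symmetrically about the target receive identical delays, since their path lengths are equal; steering the delays in time moves the bright point through space, exactly as the text describes.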

The possibility of obtaining high-quality spatial images, changing in time and standing free in the air, opened up prospects for volumetric visualization systems in the most varied fields of science and technology and proved a genuinely revolutionary development touching every sphere of human existence. The creation of spatial computer models of phenomena, processes, objects and living bodies now looked like a near-term reality, and interactive computer modeling was visible on the horizon. The ability to model faithfully the objects of animate and inanimate nature, as well as complex and topical phenomena, effects and processes, promised to become a crowning achievement of human genius and to take its place among the high technologies. A level of science and technology marked by an abundance of faithful computer models of objects and phenomena was expected to carry civilization to a higher stage of development, while considerably reducing the energy and resources needed for the optimal functioning and growth of research, industry, agriculture and much else.

During the first decade a number of complex and important tasks confronted humanity, among them the design of chemical substances with prescribed properties, optimization of the genome of humans and other organisms, and long-term economic, political and social forecasting. Such tasks could not be solved effectively without faithful computer models. Problems with thousands or millions of components, linked by cause-and-effect and temporal relations, could be solved only by computer enumeration of variants in real time. The experience of building a computer model for one particular task could not simply be transferred to new and complex ones. The first reason was the complexity and novelty of the tasks themselves; the second, the absence of visualization of intermediate results; the third, the fact that computer modeling in the near future was destined to touch every aspect of human life. What was needed, therefore, was a common foundation, a unique and universal set of rules for the unified modeling of tasks in the most varied spheres of human activity. In other words, life itself demanded a universal method of computer modeling and new instruments for its realization.

The science and technology of the first decade, working at the limit of their possibilities, stood only at the beginning of realizing this universal method of computer modeling. The theoretical foundations were laid for a common, multi-level virtual space in which the tasks connected with faithful computer models were to be solved, and an ABC of standards for constructing such models was planned. The achievements of recent years, volumetric visualization systems, powerful computers, improved software, and the whole body of scientific knowledge about the structure and functioning of matter, promised that the technologies of computer modeling would soon take first place among the high technologies. In the near future, it was expected, the development of industrial technologies, the design of new substances, the testing of scientific theories and many other activities would be carried out by computer modeling, and the process would be visual: a sequence of changing volumetric images subject to interactive correction.

Robotics and robot manufacture also made a significant step forward in the first decade of the new century. The already routine use of robots in technological processes spread to more and more branches of production, so naturally and painlessly that the public scarcely noticed. The sphere of robot application widened with every passing year, reaching beyond production into other fields of human activity. Robots excelled at assembly operations in the most varied branches of machine building. Diligent workers and error-free inspectors, more effective than people, they were cheaper for an employer to keep on the production line than salaried employees.

Most industrial robots ran under rigid programs that fully prescribed their actions, but in some technological processes robots with elements of artificial intelligence began to be employed. Their software allowed them to perform complex functions within a space of multiple constraints. AI robots were used above all to control unmanned production, that is, in fields where it was impossible to foresee every negative situation in advance. In a chemical plant, for example, such a robot, drawing on instrument readings and general knowledge of the technological process, could predict an emergency at one point or another, switch in reserve capacity in good time, and call repair specialists to remove the defect.
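
The monitoring logic of such a plant robot can be sketched as a simple rule-based loop. All sensor names, thresholds and action strings below are invented for illustration; a real system would of course use far richer process models:

```python
# Illustrative sketch (sensor names and limits invented): a monitor that
# watches instrument readings, flags any value drifting out of its safe
# band, switches in a reserve loop and summons maintenance, as the
# chemical-plant robot in the text is described as doing.
def assess(readings: dict, limits: dict) -> list:
    """Return the actions to take for any out-of-band reading."""
    actions = []
    for sensor, value in readings.items():
        low, high = limits[sensor]
        if not (low <= value <= high):
            actions.append(f"switch to reserve loop for {sensor}")
            actions.append(f"call maintenance: {sensor}={value}")
    return actions

limits = {"reactor_temp_C": (20, 90), "line_pressure_bar": (1.0, 6.0)}
print(assess({"reactor_temp_C": 95, "line_pressure_bar": 3.2}, limits))
```

A normal set of readings produces an empty action list; only a reading outside its band triggers the switchover and the maintenance call.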

The appearance of powerful microprocessors and high-quality actuators allowed robot manufacturers to enter everyday human life. At first the public met offers of household robots cautiously and with some suspicion, but in time robots became an integral part of daily life. Robots performing guard and monitoring duties were in the greatest demand. They could watch over the condition of a dwelling in the owners' absence and carry out simple protective actions: shutting off the gas and water supply, closing windows in bad weather. Their intellectual functions showed in the ability to make sensible decisions when the indoor temperature changed, when an electrical appliance entered an alarm state, when a front door or window stood improperly open, or when a pet behaved oddly. The robots could themselves summon the necessary services in case of fire, attempted break-in, or a gas or water leak, and they could distinguish a floor, wall or piece of furniture warmed by the sun from one dangerously heated at an electrical connection. Step by step, household robots became as ordinary an element of private life as pets and the means of information.

By the end of the first decade considerable success had been achieved in software development. In particular, universal programs were created for working with very large databases, equally effective in many spheres of human activity. Their main applications were scientific research in genetics, biochemistry, chemistry and sociology, as well as test modeling of high-energy weapons. In everyday life, the new developments of public significance showed themselves in the creation of computer translation devices.

The first samples of portable electronic translators were devices consisting of a system for recognizing the sounds of human speech, a powerful processor, and a system for reproducing sounds, words and frequently used sentences. Such translators allowed people with no knowledge of each other's languages to communicate quite effectively. About seventy percent of the sounds uttered by a speaker were identified correctly; a program correction filter amended a further ten to fifteen percent of the perceived material at the translation stage. Thanks to high-quality software, even the first models let interlocutors understand each other well, at least in simple everyday conversation. For professional communication, translators able to recognize special technical and scientific terms were used, though professional translation remained less effective: correct identification of spoken terms did not exceed sixty percent. The specialists' habit of discussing a problem from several sides helped them reach a good level of understanding all the same. The software allowed automatic and manual tuning to key words and terms, slang included, and supported convenient spoken abbreviations. In short, after suitably tuning their electronic translators, specialists speaking different languages could in the end understand each other correctly.
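
The pipeline described, recognition followed by a correcting filter and then translation, can be sketched in miniature. The tiny dictionaries below are invented placeholders, not a real lexicon:

```python
# A minimal pipeline sketch of the translator described in the text:
# recognized words pass through a correcting filter, then a phrase
# lookup with a word-by-word fallback. All entries are invented.
CORRECTIONS = {"helo": "hello", "gud": "good"}          # correction filter
PHRASEBOOK = {"hello": "bonjour", "good day": "bonne journée"}

def translate(recognized_words: list) -> str:
    corrected = [CORRECTIONS.get(w, w) for w in recognized_words]
    phrase = " ".join(corrected)
    # Prefer a whole-phrase match (the "often used sentences" of the text),
    # otherwise fall back to translating word by word.
    return PHRASEBOOK.get(phrase) or " ".join(
        PHRASEBOOK.get(w, w) for w in corrected)

print(translate(["helo"]))        # recognized with an error, then corrected
print(translate(["gud", "day"]))  # corrected, then matched as a phrase
```

The correction stage is what lifts a seventy-percent raw recognition rate to a usable level of mutual understanding, just as the filter in the text amends ten to fifteen percent of the perceived material.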

The appearance of portable electronic translators set several social processes in motion. In particular, the number of people traveling abroad independently without knowing foreign languages increased. Representatives of small and medium business, students and artists no longer needed the expensive services of interpreters. Feeling at ease in a multilingual environment, they could sell their goods, gain knowledge or see the sights at minimal cost. The negative side of all this was the decline of the interpreter's profession; the positive, the revival of international relations and trade, of small business, and the drawing together of peoples of different nationalities. Once outside the laboratory, computer translation technologies spread widely. Highly advantageous in the service sphere, from restaurants and bars to fixed installations in airports and places of cultural pilgrimage, electronic translators served as information bureaus in place of guides, waiters and the like.

One more outstanding event of the decade was the creation and application of the first judicial programs, which helped judges reach verdicts. Such programs found wide use in countries with case law. Indeed, only computer analysis and search made it possible to take into account all the nuances and peculiarities of tens of millions of court proceedings, to find the most apt judicial decisions of past decades and centuries, and to put forward variants of possible rulings for consideration. The final decision, of course, was made by a person. Yet the first experience of applying judicial programs in the USA showed clearly that more than half of the recommended decisions were adopted by judges without any alteration, as being sound. At first the programs were applied to everyday and administrative offences and to uncomplicated criminal cases. Their use encouraged the conversion of judicial decisions accumulated over hundreds of years into an electronic form convenient for processing by software.
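
At its core, such a program is a precedent search: score past decisions by their similarity to the current case and offer the best-matching rulings for the judge to weigh. A toy sketch, with all case data invented:

```python
# A toy sketch (cases and rulings invented) of the precedent search the
# text describes: rank past decisions by feature overlap with the case
# at hand and propose the closest rulings for the judge's consideration.
PRECEDENTS = [
    ({"offence": "parking", "repeat": False}, "warning"),
    ({"offence": "parking", "repeat": True}, "fine"),
    ({"offence": "petty_theft", "repeat": False}, "community service"),
]

def recommend(case: dict, k: int = 2) -> list:
    """Return up to k rulings from the most similar past cases."""
    scored = sorted(
        PRECEDENTS,
        key=lambda pr: sum(case.get(f) == v for f, v in pr[0].items()),
        reverse=True,
    )
    return [ruling for _, ruling in scored[:k]]

print(recommend({"offence": "parking", "repeat": True}))
```

The best-matching precedent is listed first; the judge, as the text stresses, still makes the final decision, with the program serving only as an exhaustive memory of earlier rulings.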

The wide application of judicial programs provoked public debate on the question: can a computer fully replace a human being in making judicial decisions for all kinds of delinquencies and offences? The creation and practical use of these programs also brought social consequences, since they touched the interests of practicing lawyers, judicial colleges and state officials.
