Lecture#10 Concluding session, part II The Bonch-Bruevich Saint-Petersburg State University of Telecommunications Series of lectures Telecommunication networks Instructor: Prof. Nikolay Sokolov.
New problems concerning throughput Last century: we needed 3.4 kHz for telephony (F1), 15 kHz for sound broadcasting (F2), and 8 MHz for TV broadcasting (F3). So the total bandwidth for N1 telephony channels, N2 sound-broadcasting channels, and N3 TV-broadcasting channels can be calculated by the formula N1×F1 + N2×F2 + N3×F3. Current century: we need 64 kbit/s for telephony (B1), from 64 kbit/s to 2 Mbit/s for sound broadcasting (B2), from 2 Mbit/s to 30 Mbit/s for TV broadcasting (B3), and from 2 Mbit/s to 100 Mbit/s for data transmission (B4).
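The slide's capacity formulas can be sketched in a few lines of code. The channel counts N1..N4 below are hypothetical example values, not figures from the lecture:

```python
# Last century: analogue bandwidth per channel, in kHz
F = {"telephony": 3.4, "sound": 15.0, "tv": 8000.0}  # 8 MHz = 8000 kHz
# Current century: digital bit rate per channel, in kbit/s
B = {"telephony": 64.0, "sound": 2000.0, "tv": 30000.0, "data": 100000.0}

# Hypothetical channel counts (N1..N4) chosen only for illustration
N = {"telephony": 1000, "sound": 10, "tv": 5, "data": 50}

total_khz = sum(N[s] * F[s] for s in F)    # N1*F1 + N2*F2 + N3*F3
total_kbit = sum(N[s] * B[s] for s in B)   # N1*B1 + N2*B2 + N3*B3 + N4*B4

print(f"Analogue plan: {total_khz:.1f} kHz")
print(f"Digital plan:  {total_kbit / 1000:.1f} Mbit/s")
```

Note how the single data-transmission service (B4) dominates the digital total, which is the "new problem" the slide points at.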
Definitions related to QoS In Recommendation E.800 and in a number of other ITU-T documents, several similar definitions of the term "Quality of service" are formulated: 1. Totality of characteristics of a telecommunications service that bear on its ability to satisfy stated and implied needs of the user of the service (E.800). 2. The collective effect of service performance which determine the degree of satisfaction of a user of a service. It is characterised by the combined aspects of performance factors applicable to all services, such as: - Service operability performance; - Service accessibility performance; - Service retainability performance; - Service integrity performance; and - Other factors specific to each service (Q.1741). 3. The collective effect of service performances which determine the degree of satisfaction of a user of the service (Y.101). 4. The collective effect of service performance which determine the degree of satisfaction of a user of a service. It is characterized by the combined aspects of performance factors applicable to all services, such as bandwidth, latency, jitter, traffic loss, etc. (Q.1703).
Problem of the transition to NGN PSTN operators should find a viable strategy for the transition to NGN, one that protects their investments in circuit-switched technology. Source: B. Jacobs. Economics of NGN deployment scenarios: discussions of migration strategies for voice carriers. It is necessary to combine the PSTN's quality of service with the economic efficiency of IP technologies!
QoS aspect: time irreversibility Speech-quality impairment compensation in networks with circuit switching: elaboration of new speech-signal processing algorithms; signal amplification (when necessary). Speech-quality impairment compensation in IP networks under conditions of excessive packet transfer delay: impossible in principle!
Forecast (XV century) The time will come when people from the most distant countries will speak to one another and answer one another. Leonardo da Vinci
Inaccurate Predictions "This telephone has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us." (Western Union internal memo, 1876). "I think there is a world market for maybe 5 computers." (Thomas Watson, Chairman of IBM, 1943). "There is no reason anyone would want a computer in their home." (Ken Olson, President, Chairman and Founder of Digital Equipment Corporation, 1977). "640K ought to be enough memory for anybody." (Bill Gates, Microsoft, 1981). Conclusion: "Prediction is very difficult, especially if it's about the future." (Niels Bohr, Nobel Prize in Physics, 1922).
Jipp curve (1) Jipp curve is a term for a graph plotting the number (density) of telephones against wealth as measured by the Gross Domestic Product (GDP) per capita. The Jipp curve shows across countries that teledensity increases with an increase in wealth or economic development (positive correlation), especially beyond a certain income. In other words, a country's telephone penetration is proportional to its population's buying power. The relationship is sometimes also termed Jipp Law or Jipp's Law. The Jipp curve has been called "probably the most familiar diagram in the economics of telecommunications". The curve is named after A. Jipp, who was one of the first researchers to publish on the relationship. The number of telephones was traditionally measured by the number of landlines, but more recently, mobile phones have been used for the graphs as well. It has even been argued that the Jipp curve (or rather its measures) should be adjusted for countries where mobile phones are more common than landlines, namely for developing countries in Africa.
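The Jipp relationship is often approximated as linear on log-log axes. A minimal sketch of such a fit follows; the GDP and teledensity figures are made up for illustration and are not real country statistics:

```python
import math

# Hypothetical data points: GDP per capita (USD) vs phones per 100 people
gdp = [500, 2000, 8000, 20000, 50000]
teledensity = [1.2, 6.0, 28.0, 70.0, 160.0]

# Ordinary least squares on log-transformed data: log(T) = a + b*log(G)
xs = [math.log(g) for g in gdp]
ys = [math.log(t) for t in teledensity]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

print(f"Fitted elasticity b = {b:.2f}")  # b > 0: teledensity rises with wealth
```

A positive slope b reproduces the curve's core claim: telephone penetration grows with buying power.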
Classifications of clients (2) Source: Telcordia Technologies
NGN as an economical solution Communication operators can increase their revenues by solving two important problems. Firstly, independently or with the assistance of service providers, it is expedient to take over another niche implicitly related to the telecommunications business: information services, which in the long run will increase operators' revenues. Secondly, revenues can be increased by minimizing expenses. In this instance the matter concerns optimal ways of developing the infocommunication system and perfecting maintenance processes; the efficiency of these processes determines, to a great extent, the level of operational expenses for system management. From the economic point of view, the NGN concept can be considered as the fulfilment of new requirements of potential clients at the expense of a comparatively slight increase of CAPEX with an essential decrease of OPEX.
Methods of the forecasting (1) Genius forecasting – This method is based on a combination of intuition, insight, and luck. Psychics and crystal ball readers are the most extreme case of genius forecasting. Their forecasts are based exclusively on intuition. Science fiction writers have sometimes described new technologies with uncanny accuracy. There are many examples where men and women have been remarkably successful at predicting the future. There are also many examples of wrong forecasts. The weakness in genius forecasting is that it is impossible to recognize a good forecast until the forecast has come to pass. Some psychic individuals are capable of producing consistently accurate forecasts. Mainstream science generally ignores this fact because the implications are simply too difficult to accept. Our current understanding of reality is not adequate to explain this phenomenon.
Methods of the forecasting (2a) Trend extrapolation – These methods examine trends and cycles in historical data, and then use mathematical techniques to extrapolate to the future. The assumption of all these techniques is that the forces responsible for creating the past will continue to operate in the future. This is often a valid assumption when forecasting over short-term horizons, but it falls short when creating medium- and long-term forecasts. The further out we attempt to forecast, the less certain we become of the forecast. There are many mathematical models for forecasting trends and cycles. Choosing an appropriate model for a particular forecasting application depends on the historical data. The study of the historical data is called exploratory data analysis. Its purpose is to identify the trends and cycles in the data so that an appropriate model can be chosen.
Methods of the forecasting (2b) The most common mathematical models involve various forms of weighted smoothing methods. Another type of model is known as decomposition. This technique mathematically separates the historical data into trend, seasonal and random components. A process known as "turning point analysis" is used to produce forecasts. ARIMA models such as adaptive filtering and Box-Jenkins analysis constitute a third class of mathematical model, while simple linear regression and curve fitting is a fourth. The common feature of these mathematical models is that historical data is the only criterion for producing a forecast. One might think, then, that if two people use the same model on the same data the forecasts will also be the same, but this is not necessarily the case. Mathematical models involve smoothing constants, coefficients and other parameters that must be decided by the forecaster. To a large degree, the choice of these parameters determines the forecast.
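A minimal sketch of one weighted-smoothing model mentioned above, simple exponential smoothing. The traffic series and the smoothing constant alpha are illustrative choices, not data from the lecture:

```python
def exp_smooth(series, alpha):
    """Return smoothed values; the last one serves as the one-step-ahead forecast."""
    smoothed = [series[0]]  # initialise with the first observation
    for x in series[1:]:
        # Each smoothed value is a weighted average of the new observation
        # and the previous smoothed value.
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

traffic = [100, 104, 101, 110, 108, 115]  # e.g. monthly call minutes (made up)
forecast = exp_smooth(traffic, alpha=0.3)[-1]
print(f"Next-period forecast: {forecast:.1f}")
```

Changing alpha changes the forecast even though the data is identical, which illustrates the slide's point: the forecaster's parameter choices largely determine the result.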
Methods of the forecasting (3a) Consensus methods – Forecasting complex systems often involves seeking expert opinions from more than one person. Each is an expert in his own discipline, and it is through the synthesis of these opinions that a final forecast is obtained. One method of arriving at a consensus forecast would be to put all the experts in a room and let them "argue it out". This method falls short because the situation is often controlled by those individuals who have the best group interaction and persuasion skills.
Methods of the forecasting (3b) A better method is known as the Delphi technique. This method seeks to rectify the problems of face-to-face confrontation in the group, so the responses and respondents remain anonymous. The classical technique proceeds in a well-defined sequence. In the first round, the participants are asked to write their predictions. Their responses are collated and a copy is given to each of the participants. The participants are asked to comment on extreme views and to defend or modify their original opinion based on what the other participants have written. Again, the answers are collated and fed back to the participants. In the final round, participants are asked to reassess their original opinion in view of those presented by other participants. The Delphi method generally produces a rapid narrowing of opinions. It provides more accurate forecasts than group discussions. Furthermore, a face-to-face discussion following the application of the Delphi method generally degrades accuracy.
Methods of the forecasting (4) Simulation methods – Simulation methods involve using analogs to model complex systems. These analogs can take on several forms. A mechanical analog might be a wind tunnel for modeling aircraft performance. An equation to predict an economic measure would be a mathematical analog. A metaphorical analog could involve using the growth of a bacterial colony to describe human population growth. Game analogs are used where the interactions of the players are symbolic of social interactions. Mathematical analogs are of particular importance to futures research. They have been extremely successful in many forecasting applications, especially in the physical sciences. In the social sciences, however, their accuracy is somewhat diminished. The extraordinary complexity of social systems makes it difficult to include all the relevant factors in any model.
Methods of the forecasting (5) Scenario – The scenario is a narrative forecast that describes a potential course of events. Like the cross-impact matrix method, it recognizes the interrelationships of system components. The scenario describes the impact on the other components and the system as a whole. It is a "script" for defining the particulars of an uncertain future. Scenarios consider events such as new technology, population shifts, and changing consumer preferences. Scenarios are written as long-term predictions of the future. A most likely scenario is usually written, along with at least one optimistic and one pessimistic scenario. The primary purpose of a scenario is to provoke the thinking of decision makers, who can then posture themselves for the fulfillment of the scenario(s). The three scenarios force decision makers to ask: 1) Can we survive the pessimistic scenario? 2) Are we happy with the most likely scenario? 3) Are we ready to take advantage of the optimistic scenario?
Methods of the forecasting (6) Decision trees – Decision trees originally evolved as graphical devices to help illustrate the structural relationships between alternative choices. These trees were originally presented as a series of yes/no (dichotomous) choices. As our understanding of feedback loops improved, decision trees became more complex. Their structure became the foundation of computer flow charts. Computer technology has made it possible to create very complex decision trees consisting of many subsystems and feedback loops. Decisions are no longer limited to dichotomies; they now involve assigning probabilities to the likelihood of any particular path. Decision theory is based on the concept that an expected value of a discrete variable can be calculated as the average value for that variable. The expected value is especially useful for decision makers because it represents the most likely value based on the probabilities of the distribution function.
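The expected-value computation behind decision trees can be sketched directly: E[X] is the probability-weighted average over the outcomes of a branch. The payoffs and probabilities below are hypothetical:

```python
def expected_value(outcomes):
    """outcomes: list of (probability, value) pairs; probabilities must sum to 1."""
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
    return sum(p * v for p, v in outcomes)

# One branch of a hypothetical tree: three possible payoffs (revenue, millions)
branch = [(0.2, 5.0), (0.5, 2.0), (0.3, -1.0)]
print(f"Expected value of this branch: {expected_value(branch):.2f}")
```

A decision maker would compute this value for every branch of the tree and follow the path with the highest expected value.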
Methods of the forecasting (7) Combining Forecasts It seems clear that no forecasting technique is appropriate for all situations. There is substantial evidence to demonstrate that combining individual forecasts produces gains in forecasting accuracy. There is also evidence that adding quantitative forecasts to qualitative forecasts reduces accuracy. Research has not yet revealed the conditions or methods for the optimal combinations of forecasts. Judgmental forecasting usually involves combining forecasts from more than one source. Informed forecasting begins with a set of key assumptions and then uses a combination of historical data and expert opinions. Involved forecasting seeks the opinions of all those directly affected by the forecast (e.g., the sales force would be included in the forecasting process). These techniques generally produce higher quality forecasts than can be attained from a single source. Combining forecasts provides us with a way to compensate for deficiencies in a forecasting technique. By selecting complementary methods, the shortcomings of one technique can be offset by the advantages of another.
Example of Delphi technique The question is: how many Y-terminals will be installed by the year 2000? 1. Ten experts sent the following estimates: 1 million (5 opinions), 1.2 million (3 opinions), 1.4 million (2 opinions). 2. Mean value is 1.14 million. 3. Variance is 0.0244. 4. Coefficient of variation is about 0.14. Conclusion: the forecast is stable.
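The statistics for the Delphi example above can be computed directly from the ten expert estimates (in millions of terminals):

```python
estimates = [1.0] * 5 + [1.2] * 3 + [1.4] * 2  # ten expert opinions

n = len(estimates)
mean = sum(estimates) / n
variance = sum((x - mean) ** 2 for x in estimates) / n  # population variance
cv = variance ** 0.5 / mean                             # coefficient of variation

print(f"mean = {mean:.2f} M, variance = {variance:.4f}, CV = {cv:.3f}")
```

A small coefficient of variation (here about 0.14) means the expert opinions have converged tightly around the mean, supporting the slide's conclusion that the forecast is stable.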
Dependability Strictly speaking, dependability should be considered as one of the quality aspects. Nevertheless, some specialists consider dependability an independent term with the same status as quality. Dependability is the property of an object to retain, over the course of time and within specified limits, the values of all parameters that characterize its capability to perform the required functions under the regimes and conditions of application, technical maintenance, repair, storage and transportation predetermined for that object. Obviously, there is no sense in speaking about an object's dependability during the periods when it is withdrawn from operation for scheduled inspections, modernization and other procedures.
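The definition above is qualitative; in engineering practice dependability is commonly quantified through steady-state availability, A = MTBF / (MTBF + MTTR). The MTBF and MTTR figures below are hypothetical, and (consistent with the slide) scheduled-maintenance periods are excluded from the calculation:

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability: fraction of operating time the object is up."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical figures: one failure per year (8760 h), 4 h mean repair time
a = availability(mtbf_hours=8760.0, mttr_hours=4.0)
downtime_min_per_year = (1 - a) * 8760 * 60

print(f"Availability = {a:.5f} (~{downtime_min_per_year:.0f} min downtime/year)")
```

This kind of figure is what lets operators compare, say, the "five nines" (99.999%) availability traditionally expected of the PSTN against packet-network equipment.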