Cognitive Systems - Information Processing Meets Brain Science


I know the report is damning, and it may be based upon solid evidence, but how sure are we? We must allow for that uncertainty in our thinking. Defence-in-depth fallacy. This is the fallacy of computing science experts who entertain the idea that graceful degradation of automated information processing (a fault-tolerant architecture) shall be fail-safe because the automated system is designed to stop the process under control if in doubt over the data. The question of how the computer will doubt its input data, its own logic, and the outputs it generates, when it has no access to the real world as human beings have, is not asked.

The transport programme has large benefits and no major costs. I suspect the affect heuristic. The great and the good in the company agree with the programme mission, and they like their plans.



I suspect the affect and satisficing heuristics and the planning fallacy [36]. Narrative fallacy. The engineer has fallen for a narrative fallacy. Out of sight, out of mind bias. The fault tree and event tree analyses of the train crashes do not show any management or technical errors. Blame game. The train failed in the tunnel. The communication between the trackside and train-based equipment did not take place in the degraded scenario due to operator error.

The computer simulation did not test this scenario. Gambler's fallacy. Clear-cut information about the probability of an event is not taken into account because people believe that chance is a self-correcting process, such that a deviation in one direction will necessarily be followed by a deviation in the opposite direction.

Nobel Laureate Daniel Kahneman notes that treating the expected value of a gamble, that is, the sum over outcomes of the probability of each loss or gain multiplied by the value of that outcome, as the quantity actually considered by the decision maker(s) or taker(s) is poor psychology. This type of erroneous argument can be seen in the case of level crossings [25]. Illusion of invincibility bias. The supplier has announced a new train protection system which is designed to be fail-safe and uses multiple, redundant channels for information processing.
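For concreteness, the expected-value rule referred to above can be written out; the prospect-theory correction shown alongside is a standard formulation of Kahneman's point and is not quoted from the cited source:

```latex
% Expected value of a gamble with outcomes x_i occurring with probabilities p_i:
\mathbb{E}[V] = \sum_{i} p_i \, x_i
% Prospect theory replaces the raw probability with a decision weight w(p_i)
% and the outcome with a reference-dependent value v(x_i):
V_{\mathrm{PT}} = \sum_{i} w(p_i) \, v(x_i)
```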

The full moon effect on information processing is not recognised. The supplier has furnished us with the risk register for the anticipated risks. The hazard mitigation method is noted by the domain experts, but the method of hazard control is insufficient for the risk. Allais paradox: Laureate economist Daniel Kahneman cites a decision situation in which expert statisticians and future Laureates in Economics violated the norms of Expected Utility Theory and the axioms of Rational Choice owing to the certainty effect. The train driver has the ultimate responsibility for the safety of the train and its passengers and has to comply with signal commands.

We have robust systems for recruiting, training, developing, and certifying the train staff.


Our operating rules and regulations are robust. The train drivers can handle emergency and normal situations within their cognitive workload limits. Our experience has shown that the signalling systems were functioning correctly after the accident. Human intuitions are prone to errors and mistakes, so let us automate the train driving task. Instrumentalism fallacy. Give a small boy a hammer, and he will find that everything he encounters needs pounding. I suspect the instrumentalism fallacy. We must be careful of the lack of connection between geometry and physics.

Measurement fallacy. Risk not measured is not managed. Let us quantify the risk of rare events according to the Poisson method and justify that it is acceptable because the greater good of society is served by ignoring so-called wider human factors. I suspect the measurement fallacy. Concrete jungle fallacy. European and American city dwellers have a much higher percentage of rectangularity in their environments than non-Europeans, and so are more susceptible to the Müller-Lyer illusion. The Müller-Lyer illusion occurs when two parallel lines of equal length, with arrowheads or arrow tails placed at their ends, appear to differ in length.

Coherence bias. The plan to implement the requirements as a decision rule has been agreed by domain experts. But this plan fails to meet the decision criteria for cognitive adequacy and safety requirements. Warnings about the inadequacy are dismissed as soon as they are raised. I suspect groupthink bias [4, 36]. Fault and event tree analysis bias. Some of the above latent human factors that may contribute to a potential ERTMS accident were noted by Sanjeev Appicharla in [6, 8]. The author refers the reader to an excellent online report by Felix Redmill in the computing science domain on how to judge whether safety risks are ALARP via a decision-making process [57].

The SIRI methodology draws upon the insights of Nobel Laureates in Economics: Herbert Simon, Daniel McFadden, and Daniel Kahneman in particular [36, 42, 67]. The author does not subscribe to the idea that automated risk assessment tools such as genetic algorithms are of help. Readers may note that the SIRI methodology is an engineering methodology to assist system and safety analysis of engineered systems by taking into account success and failure scenarios, based upon the theory of decision-making under uncertainty in the data and in the decision-making process [35, 37].

The challenges posed by problems of complexity, causality, overconfidence, human error, hindsight and outcome biases, bounded rationality, economic choices, cognitive limitations, out of sight, out of mind bias, the halo effect, omissions, and oversight have to be met by any methodology used in decision making for the assurance of system safety risk management of complex engineered systems [55].

In this section, the question of why some wrong approaches to safety risk management, those relying upon risk-benefit analysis, fault and event tree analysis, or reactive risk management, persist was discussed.

Description

To manage potentially or actually hazardous situations, the steps followed in the system and safety analysis as per the SIRI methodology are as follows [3]. The first activity is usually carried out within a team, and a process is used to elicit domain knowledge through the representation of diagrams, such that a validated design or concept diagram is taken as input to the next stage of the SIRI methodology. Accident scenarios are then modelled in a causal analysis process through Energy and Barrier Trace Analysis to identify harmful energy sources, victims, and barriers; failures of barriers are detected through the application of the Management Oversight and Risk Tree questionnaire, and the results are compared with the Skills-Rules-Knowledge framework and the Swiss Cheese Model to identify latent errors and develop the Hazard and Causal Factors Analysis Report.
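As a purely illustrative sketch of the energy-barrier bookkeeping described above, a record of this shape could be kept; the class name, fields, and the example barriers are this sketch's assumptions, not artefacts of the SIRI literature:

```python
from dataclasses import dataclass, field

@dataclass
class BarrierTrace:
    """One energy-flow record in an Energy and Barrier Trace Analysis."""
    energy_source: str  # harmful energy, e.g. "moving train (kinetic energy)"
    target: str         # potential victim, e.g. "road user on the crossing"
    barriers: list = field(default_factory=list)  # (barrier name, failed?) pairs

    def failed_barriers(self):
        """Return barriers that did not perform, i.e. candidate latent errors
        to cross-check against the MORT questions and the Swiss Cheese Model."""
        return [name for name, failed in self.barriers if failed]

trace = BarrierTrace(
    energy_source="moving train (kinetic energy)",
    target="road user on the crossing",
    barriers=[("barriers lowered before train arrival", True),
              ("flashing white aspect proving the crossing", True),
              ("driver sights crossing indicator and brakes in time", True)],
)
print(trace.failed_barriers())  # all three barriers failed in this scenario
```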

The last three steps may involve an iterative process between them; the process of developing understanding may require intermediate stages in which results are stored in a draft version so that the branches of the Management Oversight and Risk Tree (MORT) questions can be revisited from engineering and risk management perspectives. A red, amber, green marking system may be needed, as each sequence of the energy transfer process may need to be revisited. Further, the original Management Oversight and Risk Tree was developed with the understanding that, at the design phase, engineers and their managers would be able to perceive, conceive, and act upon the identified hazards before the close-out of the design process [35, 37].
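The red, amber, green marking mentioned above could be tracked with something as simple as the following sketch; the branch codes echo MORT's letter-based labelling described later in this section, but the particular codes and status wordings are assumed for illustration:

```python
from enum import Enum

class Status(Enum):
    RED = "revisit: question unresolved"
    AMBER = "partially answered, evidence pending"
    GREEN = "closed out"

# Illustrative review state for a few MORT branches/leaves.
review = {"SB1": Status.GREEN, "SD4": Status.AMBER, "MA2": Status.RED}

# Branches still needing another pass through the questionnaire:
to_revisit = [code for code, s in review.items() if s is not Status.GREEN]
print(to_revisit)  # ['SD4', 'MA2']
```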

However, as the railway domain does not use the concept of affordance of harm from the system as a design criterion, as required by the human factors engineering process, it is necessary to consider the various heuristics used by designers and operators, and the resulting biases that may arise at design time as well as at operational time, in the risk assessment, safety verification, and validation phases [5, 6].

To produce a model of an operational railway, the model should be able to reflect the real world closely. The operational railway includes several interfaces in all operational circumstances (a minimal machine-readable sketch of these interfaces follows the list):

  • Man-machine interfaces: driver-line signals, signaller-automatic route setting, driver-train, etc.
  • Organisational interfaces: safety standards, failure management, and hazard control between duty holders, between duty holders and industry bodies, and between various types of organisations.
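One minimal way to record these interfaces in a model is as a list of labelled links; the structure below is this sketch's assumption, not one of the modelling languages critiqued in the next paragraph:

```python
from collections import defaultdict

# Illustrative interface model for the operational railway; the entities and
# interface kinds are taken from the list above, the data shape is assumed.
interfaces = [
    ("driver", "line signals", "man-machine"),
    ("signaller", "automatic route setting", "man-machine"),
    ("driver", "train", "man-machine"),
    ("duty holder", "duty holder", "organisational: failure management"),
    ("duty holder", "industry body", "organisational: safety standards"),
]

# Group interfaces by kind for inspection:
by_kind = defaultdict(list)
for a, b, kind in interfaces:
    by_kind[kind.split(":")[0]].append((a, b))
print(dict(by_kind))
```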

However, present modelling languages suffer from a disadvantage in the sense that they tend to superimpose their own order on existing systems and fail to capture the rich partial order present in the system. The application of the SIRI methodology to the incident situation under study is described next. On Sunday, 19 June, a passenger train travelling from Aberystwyth to Machynlleth ran onto the level crossing at Llanbadarn while the barriers at the crossing were raised, and came to a stop with the front of the train about 31 metres beyond the crossing.

There were no road vehicles or pedestrians on the crossing at the time. The immediate cause of the incident was that the train driver did not notice that the indicator close to the crossing was flashing red until it was too late for him to stop the train before it reached the crossing. An underlying cause of the incident was that the signalling system now in use on the lines from Shrewsbury to Aberystwyth and Pwllheli does not interface with the automatic level crossings on these routes.

These recommendations cover the development of engineering solutions to mitigate the risk of trains passing over automatic crossings which have not operated correctly; changes to the operating equipment of Llanbadarn crossing; the processes used by railway operators to request permission to deviate from published standards; the operational requirements of drivers as trains depart from Aberystwyth; and the way in which drivers interact with the information screens of the cab signalling used on the Cambrian lines. The diagram does not show the actual mental world of an individual; it is a model, a representation to be used by the SIRI analyst to reason about certain behaviour within the philosophical, teleological, cultural, and scientific traditions of thinking and reasoning reflected in the literature on risk.

However, it should be noted that this model does not reflect absolute truths. As Prof. David Hand has written, one must revert to religion or pure mathematics for absolute truths [26].

To enable easy comprehension of the context of UK railway industry operations, a system diagram is prepared. This is shown in Figure 1. Organisations such as Alstom, Siemens, Ansaldo, Bombardier, Invensys, and Thales that supply signalling solutions are represented as contractors. Notified bodies and project safety organisations are treated as entities acting as contractors providing safety auditing, assessment, advice, and accreditation.

Brief details of the European process for validation and certification are given in Section 5.

Figure 1. Architecture context diagram of the railway industry.

The solid red lines in the figure indicate safety-critical interfaces and functions, and the dotted red lines indicate influences emerging from accident investigations. Symbol 2 indicates Passenger Focus, an independent body set up by the UK Government to protect the interests of passengers. The description of the hazard identification and analytical methods used in the SIRI methodology is available in the published literature.

It is a hazard identification technique promoted by the UK chemical industry in its early years [17]. Reading of the relevant paragraph and subsequent text of the RAIB Report suggests that the signaller made a mistake in setting the routes, which led to a timing sequence problem and, in turn, to the barriers being raised before the train had passed over the crossing space. This discovery should provoke thoughts on the requirements management process used in the programme management of ETCS programmes.


If a real HAZOP study were conducted, this failure might provoke thinking about the adequacy of the study of failure scenarios and barriers as well. The absence of a road user at the crossing space averted a potential accident. The real accident, had it occurred, could have led to a range of outcomes involving loss of life as well as public and media outrage, had many fatalities resulted. At a minimum, the risky scenario of raised barriers, with the front of the train stopping 31 metres beyond the crossing space, would certainly have resulted in a collision between a road vehicle and the rail vehicle had a road user been present, given the present understanding of the laws of physics [48].

The SIRI methodology adopts a system-induced error approach; therefore, it is necessary to look at errors from a holistic perspective. The incident shows intelligence failure on the part of all the organisations involved, a failure of the kind described by Turner as well [78]. The MORT diagram uses the logic of the fault tree. It contains two main branches. One relates to the control of technical factors and is denoted by the letters SB, SD, etc. The other branch relates to management factors and is denoted by letters such as MA and MB. Leaves within these branches are denoted by lower-case labels such as a1, b2, etc.

This is shown in Figure 4.
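Because MORT borrows the logic of the fault tree, its gate structure can be sketched as nested OR/AND nodes. The small evaluator below illustrates that gate logic only; the tree shape and leaf states are invented for the example and do not reproduce the actual MORT tree:

```python
# Minimal fault-tree evaluator illustrating MORT's use of gate logic.
# Leaf names (a1, b2, m1) echo MORT's lower-case leaf labelling.
def evaluate(node, leaf_states):
    kind, children = node[0], node[1:]
    if kind == "LEAF":
        return leaf_states[children[0]]          # True = factor present/failed
    results = [evaluate(c, leaf_states) for c in children]
    return any(results) if kind == "OR" else all(results)

tree = ("OR",                                    # top event
        ("AND", ("LEAF", "a1"), ("LEAF", "b2")), # technical-style branch
        ("LEAF", "m1"))                          # management-style branch

print(evaluate(tree, {"a1": True, "b2": False, "m1": False}))  # False
print(evaluate(tree, {"a1": True, "b2": True,  "m1": False}))  # True
```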

The description of the expected event sequence, forming a coherent account, uses the notation of ECF analysis. The criteria to be used to read the event sequence diagram follow:

  • Events are enclosed in rectangles and connected to other events as a forward chain using horizontal arrows;
  • Colour coding is used to distinguish the infrastructure manager (IM) domain, the railway undertaking or train operating (RU) domain, and the user domain;

  • Events are labelled with numbers or letters to identify the sequential flow of events in the respective duty holder domain.

The Concept of Operations describes the operational scenario in which the train is approaching the warning board and the train driver is in a vigilant mode of information processing. However, the description of the incident in the RAIB report informs us that the event of approaching the warning board was delayed because the train entered Staff Responsible (SR) mode [70].
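The ECF notation above can be mirrored in data. In the sketch below, the domain codes follow the IM/RU convention of the list, while the event labels and descriptions are paraphrased assumptions, not the report's exact wording:

```python
from dataclasses import dataclass

@dataclass
class Event:
    label: str   # e.g. "E-RU-6": sequential label within a duty-holder domain
    domain: str  # colour-coded domain: "IM", "RU", or "user"
    text: str    # event description (the contents of the rectangle)

# Forward chain of events, read left to right along the horizontal arrows:
chain = [
    Event("E-RU-1", "RU", "train departs Aberystwyth in SR mode"),
    Event("E-IM-1", "IM", "crossing sequence starts; flashing white expected"),
    Event("E-RU-6", "RU", "driver perceives red indicator; brakes applied"),
]
for prev, nxt in zip(chain, chain[1:]):
    print(f"{prev.label} ({prev.domain}) -> {nxt.label} ({nxt.domain})")
```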

This is a latent error embedded in the system, where engineering and organisational errors are committed. Further, the RSSB statement does not fit the idea of systems thinking that the whole is more than the sum of its parts, an idea entertained by systems engineers as well as human factors specialists. Given the Concept of Operations diagram which the author has developed, introducing timing analysis into the scheme is not difficult if data from human factors engineering is included as well.

Direct inspection of the event sequence described in the diagrams shown in Figure 4 reveals that, contrary to the expected event labelled E-IM in the Network Rail domain (a flashing white aspect), the event labelled E-RU-6 in the Arriva Trains Wales domain occurred: a red light was perceived, suggesting that the barriers were raised and an obstruction might be expected in the path ahead of the train. Realising this, the train driver applied the brakes, but the train did not stop short of the crossing.

Representation of the concept of operations of the ABCL facility signalled by traditional lineside signalling.

From the RAIB report(s) and the information available on the stakeholder organisations' websites, the following worksheet is generated. This worksheet forms the starting point of the root cause analysis.

The logic of combinations may be applied to the following table. Credit is given to God at user-worked level crossings where no indication of an approaching train can be perceived, or where a passenger manages to escape the accident [73, 74]. Otherwise, the table indicates that these level crossings are accidents waiting to happen.
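To illustrate what the logic of combinations means here, one can enumerate the joint states of the crossing; the three factors below are this sketch's assumption rather than the worksheet's actual columns:

```python
from itertools import product

# Enumerate joint states at the crossing; True/False per factor.
factors = ["barriers_raised", "train_on_crossing", "road_user_present"]
for combo in product([False, True], repeat=3):
    state = dict(zip(factors, combo))
    # The accident combination: barriers up while train and road user meet.
    if all(state.values()):
        print("accident waiting to happen:", state)
```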


The ECFA activity yields information on unsafe acts.

