
Past Projects

Records and Information Management for Financial Analysis and Risk Management Workshop

Start Date: 2011-05-22 | End Date: 2012-05-22

Project Description:


Workshop on Records and Information Management for Financial Analysis and Risk Management



1. Introduction and Background

The Global Financial Crisis has drawn attention to the importance of financial records and information management in support of financial analysis and risk management. A lack of rigorous research on, and best practice standards for, records and information management in this domain has led to operational risks in financial institutions, flawed bankruptcy and foreclosure proceedings following the Crisis, and inadequacies in financial supervisors’ access to records and information for the purposes of a prudential response. Financial analysis and risk management, a concern of researchers in finance, is dependent upon effective information processing, while archival science, management information systems and information science address research questions of relevance to the problem of how best to inscribe, manage and preserve records and information. Yet rarely do researchers from these different disciplines interact to consider how their research might best inform both financial records and information management and the social and economic benefits that such research should provide. If seemingly independent domains of discourse - finance, management information systems, archival and information science - were to engage in a common conversation, it is possible that these domains would benefit from deepened perspectives and new insights.


The theory and practice of records and information management is a critical issue with potentially significant economic and social consequences.  


This will be a small-scale workshop to encourage the level of intensive interaction among researchers needed to engage with fundamental questions about records and information management for financial analysis and risk management. The objective of the workshop is to communicate and advance research and scholarship on this critical issue by facilitating direct interaction among researchers and students, both from Canada and abroad and from different disciplines, in order to identify and formalize the fundamental theories, policies and models of records and information management that are critical for robust financial analysis and risk management. The workshop builds on and extends the discourse commenced at a U.S. National Science Foundation-funded workshop held in July 2010 on a similar theme.


2. Workshop Topics

The workshop, which will take place over one and a half days, will draw together experts and students to discuss the following topics: 1) Governance – the governance structures that need to be in place to ensure that records and information needed for effective financial analysis and risk management are available to all market participants; 2) Analytics – ways of representing and communicating about financial records and information (knowledge representation) and the use of visualizations in the analysis of financial records and information (visual analysis); and 3) Life cycle management – the long-term availability, preservation and accessibility of financial records and information, which introduces a new research theme to the discourse on financial information management.


3. Participant Bios



Michael Atkin has been a professional facilitator and financial information industry advocate for over 20 years. He is currently the Managing Director for the Enterprise Data Management Council -- a business forum for financial institutions, data originators and vendors on the strategy and tactics of managing data as an enterprise-wide asset. Mr. Atkin is an active participant in standards initiatives and has been involved with many organizations including the Reference Data Coalition (REDAC), the Securities and Financial Information Markets Association (SIFMA), the Association of National Numbering Agencies (ANNA) and the UK Reference Data User Group (RDUG). He was also a member of the SEC's Advisory Committee on Market Data and a member of both ISO TC68 and ANSI X9D. He has been the Managing Director of the EDM Council since February 2006. His expertise and extensive knowledge on the subject of financial information management will be critical in facilitating and contributing to breakout discussions.


Willi Brammertz obtained his PhD in Economics from the University of Zurich. His doctoral thesis discussed the elementary parts of finance, from which various financial analyses can be simply derived. Dr. Brammertz is continually refining and developing this concept while maintaining the foundation laid in his thesis. In 1992, together with Dr. Jürg B. Winter, he founded IRIS integrated risk management ag. At first, he focused on consulting projects in the area of Asset and Liability Management, implementing external systems at several banks. Beginning in 1996, he applied the insights from his doctoral thesis as Chief Technology Officer by creating riskpro™. In 2008, IRIS was sold to FRSGlobal, a leading provider of regulatory reporting in the banking sector. Dr. Brammertz, now an independent consultant, regularly speaks at international conferences on risk management and regulatory compliance and has published numerous articles. In 2009, Wiley & Sons published Dr. Brammertz’s first book, “Unified Financial Analysis – the missing links of finance”, which he co-authored with distinguished colleagues based on his more than twenty years of experience in financial analysis.



Thomas Dang is currently a Masters student in Computer Science, specializing in Human-Computer Interaction, at the University of British Columbia and is a researcher at the Centre for the Investigation of Financial Electronic Records (CiFER). Thomas is also a member of the Visual Cognition Laboratory and the Media and Graphics Interdisciplinary Centre at the University of British Columbia. His M.Sc. research topic is in the domain of visualization and analysis of large, heterogeneous bodies of information in finance. Thomas will be presenting on his research, and his participation in this workshop will provide him with an opportunity to receive feedback on its progress.


Brian Fisher is Associate Professor of Interactive Arts and Technology and Cognitive Science at Simon Fraser University, and a member of the SFU Centre for Interdisciplinary Research in the Mathematical and Computational Sciences. At the University of British Columbia he is the Associate Director of the Media and Graphics Interdisciplinary Centre  (MAGIC), Adjunct Professor of Computer Science, and Associated Faculty in Psychology as well as a member of UBC Brain Research Centre and the Institute for Computing, Intelligent and Cognitive Systems.  His research focuses on the cognitive science of human interaction with visual information systems, with the goal of developing new theories and methodologies for development and evaluation of technology to support human understanding, decision-making, and coordination of operations.


Mark D. Flood did his undergraduate work at Indiana University in Bloomington, where he majored in finance (B.S., 1982), and German and economics (B.A., 1983). In 1990, he earned his Ph.D. in finance from the Graduate School of Business at the University of North Carolina at Chapel Hill. He has taught finance and business at universities in the U.S. and Canada, and worked as an Economist and Financial Economist on issues of regulatory policy and risk management at the Federal Reserve Bank of St. Louis, the Office of Thrift Supervision, the Federal Housing Finance Board, and the Federal Housing Finance Agency. He was a founding member of the Committee to Establish a National Institute of Finance. He is currently a Senior Policy Advisor in the U.S. Treasury, working for the Office of Financial Research. His research has appeared in a number of journals, including the Review of Financial Studies, Quantitative Finance, the Journal of International Money and Finance, and the St. Louis Fed's Review.


Alexandros-Andreas Kyrtsis is a Professor of Sociology at the University of Athens, in the Department of Political Science and Public Administration. His current research focuses on the analysis of the techno-organizational backstage of financial markets, of financial representations, and of risk management processes in complex projects. He has been an academic visitor at MIT, LSE, the University of Edinburgh, the Institute of Advanced Studies on Science, Technology and Society in Graz, and the Swiss Federal Institute of Technology in Zurich (ETH Zurich). He has also been an adviser to Greek banks, to the Hellenic Bankers Association, and to IT companies with projects in the financial sector. His latest publication is an edited volume entitled Financial Markets and Organizational Technologies: System Architectures, Practices and Risks in the Era of Deregulation (Palgrave Macmillan, 2010).


Lior Limonad has been a researcher with the Centre for the Investigation of Financial Electronic Records (CiFER) since October 2009 and is also a researcher at the IBM Haifa Research Lab. In the relatively short time that Lior has been a member of the CiFER research team, he has made invaluable contributions to the progress of research on financial records and information management. Specifically, he has helped to identify tools to move from informal (text-based) descriptions of information failures in the collapse of Lehman Brothers to a formal language with which to model the collapse and failure. This has been no small task, as Lior has had to quickly gain a sufficient understanding of basic archival and financial concepts and theories (e.g., the financial derivatives supply chain and the various flavours of derivative products that contributed to the collapse of Lehman Brothers) to enable him to guide the team in identifying and defining key constructs in our model. His knowledge of his field and ability to see the interconnections between his discipline and other disciplines has allowed him to propose original applications.



Allan Mendelowitz has been on the board of directors of the Federal Housing Finance Board since 2000, and he served as the board's chairman from 2000 to 2001. Previously, he was the executive director of the U.S. Trade Deficit Review Commission, a congressionally appointed bipartisan panel. Dr. Mendelowitz has also served as the vice president of the Economic Strategy Institute--supervising research on trade policy, international competitiveness, and telecommunications policy--and as an executive vice president of the Export-Import Bank of the United States. From 1981 to 1995, Dr. Mendelowitz was the managing director for international trade, finance, and economic competitiveness at the General Accounting Office. He is the co-leader of the Committee to Establish a National Institute of Finance and a key influencer of U.S. policy in the domain of financial information management.


Kafui Monu is a post-doctoral researcher at the University of British Columbia’s Sauder School of Business, specializing in Management Information Systems, and has been a researcher at the Centre for the Investigation of Financial Electronic Records (CiFER) since May 2010. His main research interest is involving users in the systems analysis process by representing their view of their work. This research focuses on developing a technique, called the Organizational Actor Modeling Methodology, to represent the users' view of their behaviour in the organization. The technique closes the gap between the user and the developer in the software development process by providing a structured model of the users' unstructured data. The work has been used to successfully represent scenarios in disaster management and retail. Participation in the workshop will provide him with an opportunity to present his research and to meet leading researchers in financial information management.


Fred Popowich is the CEO of the Vancouver Institute for Visual Analysis (VIVA) and, prior to taking this position, was Associate Dean of the Faculty of Applied Sciences at Simon Fraser University. He is an Associate Member of the Department of Linguistics and an Associate Member (and past director) of the Cognitive Science Program at SFU. He is active in technology commercialization, chaired the committee responsible for Canada's language technology roadmap, and was co-founder, president, COO and then CTO of Axonwave Software. He is currently a member of the Precarn Expert Advisory Panel and is on the Council of Partners for SFU Venture Connection. Dr. Popowich will provide workshop participants with an overview of visual analysis technologies and will lead discussion on the use of this technology in solving financial information management problems.


Rachel Pottinger  is an assistant professor in Computer Science at the University of British Columbia. She received her PhD in computer science from the University of Washington in 2004. Her main research interest is data management, particularly semantic data integration, how to manage metadata (i.e., data about data), and how to manage data that is currently not well supported by databases.  Dr. Pottinger is part of the NSERC Business Intelligence Network (BIN), which aims to enhance Canadian business competitiveness through the development of intelligent data management and decision-making solutions.


Jim Rhyne is VP, Business Development at Sandpiper Software, Inc. and a partner at Thematix Partners. He is actively involved in the Open Financial Data Group Forum and is the primary semantic architect on the OMG-EDM Council Proof of Concept project. He has also worked on the EDM Council semantic repository (now called FIBO). Formerly with IBM, he retired a few years ago to start a consultancy, and his background includes working with Ted Codd on the original System R project, many years of research in AI and Knowledge Representation, and a number of years as the CTO of IBM’s Worldwide Banking Center of Excellence. He holds a Ph.D. in AI and Computational Linguistics and taught for several years before accepting the job offer to work on the System R project.


Anya Savikhin is a Research Scholar at the University of Chicago Becker Center on Chicago Price Theory. Savikhin uses the methodology of experimental economics to explore behavior and decision-making in different contexts. Savikhin’s research centers on developing and experimentally evaluating the impact of novel interactive visual analytic tools on economic decision-making and risk choice. To date, Savikhin has investigated decision-making in a wide range of contexts, including financial portfolio selection, information overload during the consumer search process, and optimal choice under uncertainty. Savikhin is also affiliated with the Vernon Smith Experimental Economics Laboratory at Purdue University, the Center for Financial Security at the University of Wisconsin-Madison and the Financial Literacy Center at RAND and Dartmouth College. Savikhin received her Ph.D. in Economics from Purdue University in 2010.


Carrie Stevenson is Manager of Corporate Records Services at the British Columbia Securities Commission (BCSC) where she is responsible for the records programme and plays a key role in the information management strategy.  Carrie holds an MAS from UBC, and prior to joining the BCSC, she worked in a number of corporate environments including the World Health Organization, the Organisation for the Prohibition of Chemical Weapons, and the International Monetary Fund.


Christina Wolf is the Chief Economist at the British Columbia Securities Commission. As Chief Economist, she has brought evidence-based analysis to a wide range of policy initiatives including securitization, OTC derivatives, systemic risk, credit rating agency oversight, and capital raising exemptions. Christina is also responsible for areas of the commission relating to enterprise risk management, data analytics, and strategic planning. Before joining the BCSC, Ms. Wolf spent eight years working as a consultant and Practice Area Manager with the Boston Consulting Group.



Carson Woo of the Sauder School of Business at the University of British Columbia will present on his current research, which studies how to effectively support the change and evolution of information systems from a business and organizational perspective. Business changes are becoming more frequent due to competition, deregulation, globalization, and other factors. Rigid and inflexible information systems are obstacles to effective organizational and business change. In order to develop flexible and adaptable information systems, Dr. Woo has been experimenting with the effectiveness of incorporating and utilizing contextual information in the information systems architecture. Contextual information, in this case, includes commonly used concepts in intelligent and multi-agent systems such as goals, beliefs, and intentions; organizational concepts such as organizational structures, roles, and responsibilities; and business concepts such as mission, market, and regulations.


Sherry Xie holds an MLIS from McGill University and an MAS from the University of British Columbia, and is a PhD student at the School of Library, Archival and Information Studies, University of British Columbia. She has worked as a subject librarian and records manager in academic and government settings and is currently conducting research on electronic records management in highly regulated environments. She has been working since 2004 for the InterPARES Project as a graduate research assistant on all aspects relating to electronic records management and digital preservation under the direction of Dr. Luciana Duranti. Sherry joined CiFER in September 2008.






5. Logistics

The 1.5 day workshop will take place on Wednesday, August 24 and Thursday, August 25, 2011. The workshop will be held at the Blue Horizon Hotel, Vancouver, British Columbia. Participation in the workshop will be by invitation only, and will include experts in the relevant fields, student presenters and student observers.



6. Workshop Report and Dissemination

Prior to the workshop, discussion papers will be made available to frame the discourse during the workshop. Immediately following the workshop, a subset of attendees will be invited to participate in a half-day meeting to produce a draft of the workshop report. The discussion papers and summaries of the discussions held during the workshop breakout sessions will be incorporated into an edited volume to be published by Springer. The volume will include a comprehensive discussion of the challenges in this domain and set out a research agenda. It will be widely distributed within both the academic and professional communities concerned with financial records and information management. Versions will be published in journals in finance, management, and archival and information science. In addition to communicating and disseminating the results of the workshop, it is planned that the workshop report will provide a foundation and action plan for future research grants, including a SSHRC Partnership Development Grant.


7. Programme


Wednesday, August 24, 2011


11:30 a.m. – Conference registration

12:00 p.m. - 1:30 p.m. Buffet lunch and presentation

  • Acknowledgments: Introductions and thank-yous. Explanation of the structure of the workshop and what we expect and hope for from the breakout sessions following each session. Victoria Lemieux (UBC)


  • Data Implementation Challenges for Systemic Risk Monitoring. Mark Flood (Office of Financial Research) and Allan Mendelowitz (Committee to Establish the US National Institute of Finance)


This paper outlines some of the core implementation challenges that will confront regulators and market participants as they seek to better address systemic risk going forward.  Central to the challenge is the large number and diversity of individually specialized information creators and providers (market participants) who are interconnected by a complex web of contractual relationships.  Traditional accounting and supervisory processes are not adequate to the task.   One required solution to the data problem is to focus on a contract’s “financial meaning.” The core of this representation is the cash flow commitments – often contingent on other factors – between the counterparties to the contract.  Understanding this key objective is necessary at the very beginning of the process because in practice, it is possible for two contracts or portfolios to generate substantially identical cash flow patterns (financial meaning), even when their legal or machine representations differ widely.  Indeed, much of financial engineering is devoted to repackaging a fixed set of cash flow commitments into a different contractual configuration, perhaps to avoid taxable events, reduce the market impact of a trade, or simply to obfuscate the activity.  


We focus in this paper on two broad requirements: (a) the functional accessibility of information; and (b) the “cognitive” capacity of the organization.


Functional accessibility of information is what provides the ability to navigate, query, link, and define the data; in short, it is the implementation of understanding. This represents a combination of data, metadata, and navigation tools that are integrated within a coherent design framework. This combination of organized content and tools, along with the documentation and training required to socialize their use, provides the platform to enable “cognitive capacity” within a given community or organization. An important building block is the stewardship of a reliable set of legal entity identifiers (LEIs) to track counterparties consistently across the system and over time. The first step in modeling interconnections in the financial system is identifying the entities that are connected, the types of entities and entity connections, and the key information transmission paths among them. The second key building block is identification of financial contract types as determined by their financial meaning. While these financial instrument types can be represented in many different ways, including traditional legalese and structured machine representations (e.g., FpML), and while these representations may be important for other purposes, the challenge of understanding systemic risk is tied in the first instance to the financial meaning of the contracts. The sheer volume of the data presents many challenges. For example, the machine representations of the contracts might be mapped to a semantic context (e.g., a semantics repository) to provide additional interpretive specificity. Both the message schemas and associated semantics should be versioned over time. Metadata also matters, especially for data dissemination: financial exchanges, regulators, and other participants share a wide range of information – including both raw data inputs and calculated outputs – with each other and with third parties. Specialized schemas should be applied to support open publication, archival conservation, library cataloging, and reproducibility by researchers. Maintaining dissemination metadata consistently over time and across multiple distinct schemas presents its own integration and versioning challenges.
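The two building blocks described above, LEIs for entities and contract types for instruments, can be sketched as a toy data model. All names and fields here are hypothetical illustrations, not a specification from the paper: counterparties carry stable LEIs, contracts reference counterparties by LEI, and the interconnection structure of the system falls out as a graph.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass(frozen=True)
class Entity:
    lei: str          # stable legal entity identifier
    name: str
    entity_type: str  # e.g. "bank", "broker-dealer"

@dataclass(frozen=True)
class Contract:
    contract_id: str
    contract_type: str        # classified by financial meaning, not legal form
    counterparty_leis: tuple  # parties identified only by LEI

def interconnection_graph(contracts):
    """Adjacency map of entities linked by shared contracts."""
    graph = defaultdict(set)
    for c in contracts:
        for a in c.counterparty_leis:
            for b in c.counterparty_leis:
                if a != b:
                    graph[a].add(b)
    return graph

contracts = [
    Contract("c1", "interest_rate_swap", ("LEI_A", "LEI_B")),
    Contract("c2", "repo", ("LEI_B", "LEI_C")),
]
print(sorted(interconnection_graph(contracts)["LEI_B"]))  # ['LEI_A', 'LEI_C']
```

Because every contract names its counterparties by LEI rather than by free-text firm names, the same entity is recognized across data sources, which is exactly the consistency property the stewardship of identifiers is meant to buy.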


Analytic capacity constrains the ability of the organization to assimilate data to support decision-making and other concrete actions and policies. Because systemic risk brings the entire financial system within scope, data volumes are potentially enormous. Techniques for automated discovery, inference and pattern recognition to triage incoming data will be vital, especially for systemic risk supervisors and regulators. Similarly, techniques of data presentation and visualization to support decision-making and executive summaries will be crucial. These include dashboarding, dimensionality reduction, and animations.
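Dimensionality reduction, one of the presentation techniques mentioned above, can be sketched in a few lines. This is a toy example on synthetic data, not a supervisory tool: high-dimensional exposure data is projected onto its leading principal components before charting.

```python
import numpy as np

def reduce_dimensions(data, k=2):
    """Project rows of `data` onto their top-k principal components."""
    centered = data - data.mean(axis=0)
    # SVD of the centered matrix: rows of vt are the principal directions
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:k].T

rng = np.random.default_rng(0)
positions = rng.normal(size=(100, 20))   # e.g. 100 entities, 20 risk factors
summary = reduce_dimensions(positions, k=2)
print(summary.shape)  # (100, 2)
```

The 20-dimensional rows are reduced to 2 coordinates each, ordered so the first axis carries the most variance, which is what makes the result suitable for a two-dimensional dashboard view.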



1:30 p.m. – 2:15 p.m.  Theme 1: Data Governance for financial analysis and risk management

Brief presentations on records and information requirements for firm-wide and systemic-level risk management applications (including market, credit, liquidity, and other risks) and on the connection between records and information management and operational risk.


  1. “Making it operational”: avoiding operational risk in the management of records and information for financial risk management – Willi Brammertz (Brammertz Consulting)


We argue that the single largest operational risk for the OFR is to be inundated with undecipherable financial contract data. We further argue that clear semantics at the attribute level alone are not sufficient: the OFR needs semantics capable of describing the entire intent of the financial contract. In a next step we develop the concept of Contract Types (CTs), which encapsulate these semantics. The idea of CTs overcomes today’s cherished (and unreflected) separation of data and algorithms, which lies at the core of the observed data chaos in banks. We distinguish mechanical parts of finance, where the separation is counterproductive, from experimental parts, where the separation makes sense. We conclude with a model that makes the OFR operational.
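A rough sketch of the Contract Type idea follows. The class and fields below are my own illustration, not Dr. Brammertz's actual specification: the contract's attribute data and the algorithm that derives its cash-flow obligations are kept together in one object, rather than split between a database record and an unrelated analytics routine.

```python
from dataclasses import dataclass

@dataclass
class FixedRateBondCT:
    """Contract Type: contract attributes plus the cash-flow algorithm together."""
    notional: float
    annual_rate: float
    years: int

    def cash_flows(self):
        """Cash-flow schedule implied by the contract's intent."""
        flows = [self.notional * self.annual_rate for _ in range(self.years)]
        flows[-1] += self.notional   # principal repaid at maturity
        return flows

bond = FixedRateBondCT(notional=1000.0, annual_rate=0.05, years=3)
print(bond.cash_flows())  # [50.0, 50.0, 1050.0]
```

Any analysis (valuation, liquidity, stress testing) can then be derived from the cash-flow pattern the type produces, instead of each analytics system re-interpreting raw attribute data on its own.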


 2:15 p.m. – 3:00 p.m. Breakout Discussion on Theme 1

Facilitator: Mike Atkin


3:00 p.m. – 3:15 p.m. Break


3:15 p.m. – 4:00 p.m. Theme 2: Analytics - Information architectures for financial analysis and risk management

Brief presentations on some current research on ontologies, schemas and models related to issues of representing financial domains and financial risks.


  1. Approaches to the modeling of complex, dynamic domains – Carson Woo (UBC), Lior Limonad (IBM – Haifa), Kafui Monu (UBC)



Financial records can be used to help identify risks associated with financial transactions: they provide evidence of how a transaction may occur (e.g., legal contracts that define the cash-flow payments in a fixed-income instrument) or of how it has occurred (e.g., trade records that capture the details of a completed trade), and they are necessary for regulating the financial system and for asserting legal and financial claims. In some cases financial institutions do not adequately record information, which can lead to many problems, such as those observed during and in the aftermath of the financial crisis of 2007-2008. In these cases it would be useful to understand exactly how missing or incomplete records affected the situation, why they were missing or incomplete, in what ways this may have contributed to the build-up of risk in the financial system, and how the situation could be rectified or even prevented.


In this paper we suggest using conceptual modeling, a technique more often used for systems development, to gain an understanding of financial-records creation, transmission and management in the processes along the retail mortgage-backed securities originate-and-distribute supply chain. Conceptual modeling is the act of representing the substantial and social domain using specific abstractions of it. The paper explores three different conceptual modeling techniques to examine the relationship between records and risk in the context of the financial crisis. In the first example we discuss how one type of conceptual model can be used to quickly uncover gaps in, and verify assumptions about, researcher domain knowledge, and we illustrate how the approach led to important realisations about the domain essential to understanding the relationship between records and risks. In the second example, we use a previously developed instrument-centric modeling language (the ‘INCA’ model) to identify missing modeling parameters that can lead to a better understanding of risk factors when used to capture information failures along the retail MBS originate-and-distribute supply chain. In the third example, we discuss how conceptual modeling can be used to help articulate the human aspects of records creation and keeping behaviour that contributed to records and information risks. From our experiences with these different conceptual modeling approaches, we conclude that conceptual modeling has shown its value as a tool to help understand and model relationships and dynamics in our domain of interest, and as a tool to generate new insights that would otherwise have been more difficult for us to see.



4:00 p.m. – 4:45 p.m. Breakout Discussion on Theme 2


Facilitator: Mike Atkin


4:45-5:00 p.m.  Wrap-up of Day 1 and Logistics for Dinner and Day 2


6:00 p.m. - 9:00 p.m. Reception with Dinner (Location TBD)


Thursday, August 25, 2011


8:30 a.m. - 9:00 a.m. Breakfast


9:00 a.m. – 10:00 a.m. Theme 2: Analytics - Visual Financial Analytics and Risk Management

Brief presentations on current research on visual analysis of financial records and information for financial risk management


1. VA research and the Vancouver Institute for Visual Analysis (VIVA) – Fred Popowich


2.  The Application of Visual Analytics to Financial Decision-Making and Risk Management: Notes from Behavioral Economics - Anya Savikhin (U. Chicago)


Understanding how individuals make financial decisions under uncertainty and with different information settings is fundamental to informing the theory and practice of information management. Due to limitations on cognitive ability and problems of information overload, complex information sets may not be fully understood, resulting in sub-optimal decision-making by individuals and organizations. We have applied visual analytics (VA), which enables users to interactively discover information from large information sets, to improve the financial decision-making process. Using an experimental methodology, we find evidence that VA reduces the cost of obtaining information, improves decisions, and increases the confidence of users in a range of different decision tasks involving risk. This is a nascent area of research, and additional work is needed to develop and rigorously evaluate appropriate VA tools for financial decision-making and risk management. A thorough understanding of best practices for presenting complex information sets may only develop through rigorous evaluation of the effect of information presentation on actual financial choices. In addition, the impact of VA in collaborative decision-making environments is not fully understood. The future of applied visual analytics for financial decision-making and risk management must involve an interdisciplinary team of behavioral economists, visual analytics researchers, and cognitive scientists.


3. VA technologies for financial analysis - Thomas Dang (UBC)


In this paper, I aim to understand and bridge the gap between Visual Analytics (VA) research and its deployment, under imperfect conditions, to solve the multi-layered and often vaguely defined problems of real-world finance, with a particular focus on investment analysis. The primary goal of this work is to create a functional classification of VA techniques with regard to investment analysis problems, together with a table of existing products capable of supporting common investment analysis problems. With a functional classification and a table of off-the-shelf solutions, more effective and theoretically grounded feasibility and cost-benefit analyses could be performed to justify and plan applications of VA in financial organizations. The secondary goal of this project is to explore and initiate an in-depth discussion of the challenges to deploying VA solutions in finance, such as problems with file formats and information management. These secondary challenges are relevant not only to VA but also to the deployment of other automatic and semi-automatic data analysis techniques. To construct my functional classification of VA techniques, I carried out an extensive literature survey covering common investment analysis problems, Visual Analytics techniques, sense-making and intelligence analysis theories, and previous attempts by other VA researchers to classify VA techniques. Using this functional framework, I evaluated over thirty mature VA products (passing over many others that are very limited in functionality or only partially developed), in both the commercial space and the free-software community. I also evaluated the information structure and data format requirements for the efficient deployment of this representative table of tools, and used the insights from this evaluation to frame a discussion of the information/data management problems often found in financial organizations.
Finally, I conducted a brief survey of data formatting and pipelining tools that could aid the deployment of the tools in the table.
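The file-format obstacles to VA deployment mentioned above can be illustrated with a toy sketch. Everything in it is a hypothetical assumption for illustration only (the date formats, the `normalize_date` helper, and the sample records are not drawn from the paper): the point is simply that heterogeneous source formats typically must be normalized into one canonical form before an off-the-shelf VA tool can ingest the data.

```python
from datetime import datetime

# Hypothetical illustration: financial records often arrive with
# inconsistent date formats, while a VA tool usually expects one
# canonical representation (here, ISO 8601).
KNOWN_FORMATS = ["%Y-%m-%d", "%d/%m/%Y", "%b %d, %Y"]

def normalize_date(raw: str) -> str:
    """Try each known source format; return an ISO 8601 date string."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {raw!r}")

# Three records, same date, three source formats.
records = ["2011-08-25", "25/08/2011", "Aug 25, 2011"]
print([normalize_date(r) for r in records])
# prints ['2011-08-25', '2011-08-25', '2011-08-25']
```

In practice this normalization step is exactly the kind of "secondary challenge" the abstract refers to: it is mundane, but until it is done, neither VA nor other automatic analysis techniques can be deployed reliably.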


I also discuss the application of this functional classification in a case study conducted at a fixed-income investment company over the span of a summer. In this case study, I discuss how the research team systematically identified areas for improvement in the company's analytic process and isolated those that could be improved with VA. The team then mapped these detailed problem definitions to VA techniques in order to find the optimal visualizations of the data. Finally, the team implemented a solution for the company by building upon one of the free toolkits I had evaluated, achieving all the analytic goals at the least cost in time and expense. At the end of this sub-project, the solution was evaluated by the company to demonstrate quantifiable contributions to its fixed-income portfolio analysis workflow, as well as to the presentation of analytic results to the company's clients.


10 a.m. – 11 a.m.  Breakout Discussion on Theme 2

Facilitator: Mike Atkin


11 a.m. – 12 p.m. Theme 3: Life Cycle Management - Long-term preservation of financial electronic records

Brief presentation on the issues around long-term preservation of financial records and information


 1. Coping with Messiness and Fogginess in Information Management: Material and social aspects of financial representations in custodial services and proprietary trading


- Alexandros-Andreas Kyrtsis (National and Kapodistrian University of Athens)


The narratives that shape personal and social identities, and the perlocutionary acts (speech acts of persuasion) that frame the minds of the actors who participate in financial markets, also define the structuring (the in-formation) of the data these actors exploit with technologies for recording, retrieving and reconfiguring. The interpretation of these pieces of information through dominant financial representations is the basis of all kinds of financial decision-making at the various echelons of the organizations involved in financial markets. Both the storage of data (following rules of record-keeping and data entry) and the retrieval of data (enabled by pattern orientation) are driven by institutional conditions and by the representations the latter imply. Representations are forms of discursive and/or pictorial familiarization with what actors perceive as realities; in this sense they are rationalizations of fuzzy images emerging from institutional facts as defined by John Searle. In the financial markets, whose materiality depends on the technological components of financial networks and on the techno-organizational processes taking place in various intermediaries, these representations emerge from the interplay of narratives and perlocutions with financial technologies. Financial technologies are understood here as illocutions (speech acts promising or directing the accomplishment of acts with material consequences, as prescribed by obligations) originating from the authority of experts regarded as bearers of knowledge of how ordered activities of manipulating informational objects can create value in financial intermediation. This embeddedness of operations in financial representations bound to financial technologies implies messiness and fogginess. Neither phenomenon should necessarily be traced back to negligence, sluggishness or recklessness.
They more often have to do with anomic aspects of unintended consequences of action. Messiness and fogginess are phenomena originating from the discrepancies between the representations that define the processes of data entry and storage and the representations that define the processes of retrieval and configuration. Messiness is a problem of data manipulation, due to the detachment of the representations driving push technologies from the representations driving pull technologies. Fogginess is the reverse: users of data, aiming at the creation of information and, further, of knowledge, are under stress to decide about representations; they often fail and feel helpless unless they grab the first available, but not appropriate, stereotype. Fogginess creates a veil between adopted representations of realities and the data. Actors do not feel that their way of shaping information corresponds to the intentional states that led to the structure of the data they consider available. The historicity of the data can lead to both of the problems operators face. Messiness and fogginess detach perlocution from illocution and consequently detach representations from already stored data, which under certain circumstances can constitute extensional states, i.e., intrinsic features of financial reality defined by the scarcity of resources. There is then a gap between institutional and brute facts, which can be observed in many facets of financial operations, but most intensely in proprietary trading and in the case of custodians, who must cope with messiness and fogginess if they want to retain sensible levels of risk exposure and acceptable quality of their products and services.




 2. An Overview of Current Digital Preservation Approaches – Sherry Xie (UBC)



The preservation of digital records has been a challenge for the records community since the early 1970s, when the main type of digital record was the dataset produced by mainframe computers. With almost four decades having elapsed, the challenge persists and has even grown in complexity owing to the accelerating deployment and rapid advancement of digital technologies. The universal presence and growing complexity of digital technologies challenge the records community not only in conceiving technical solutions but also in understanding the very basic concept of the record. The concept of the record is the backbone of both the discipline of archival science and the two professions associated with it, i.e., records management and archival administration. The answers to what a record is and, more specifically, what a digital record is are indispensable to, and in fact the foundation of, any effort that archival institutions or records management programs exert to establish preservation strategies with efficiency and effectiveness as the goal. The search for these answers, however, has proved not to be easy. General definitions of the terms record and digital record can be said to be abundant, as almost all leading sources dedicated to the topics of digital records preservation or digital records management recognize the importance of distinguishing records from other types of digital information. However, the application of these concepts is often incomplete and insufficient to support a fully articulated digital preservation solution. As a result, the guidance the concept should offer with respect to digital preservation is only partially incorporated into institutions' preservation strategies.
This paper, by analyzing the developments of the research project International Research on Permanent Authentic Records in Electronic Systems (InterPARES), will illustrate the various challenges presented in the process of identifying digital records, the currently available solutions, and the urgent need the project discovered for further investigation of how to identify digital records and how to embed digital records preservation strategies into business process design.


12 p.m. – 1:15 p.m.  Lunch


1:15 p.m. – 2:15 p.m. Breakout Discussion on Theme 3

Facilitator: Mike Atkin


2:15 p.m. - 3:15 p.m. Summary of Breakout Sessions and Partnership Planning

Facilitator: Victoria L. Lemieux


3:15 p.m. - 3:30 p.m. Break


3:30 p.m. – 5:00 p.m. Summary of Breakout Sessions and Partnership Planning, cont’d

            Facilitator: Victoria L. Lemieux


5:00 p.m. – 5:30 p.m. Wrap up of Workshop and Logistics for Dinner


Friday, August 26, 2011

9:00 a.m. - 12 p.m. Workshop Report Writing (conference organizers only) and Tours (interested conference invitees). 


7. Key Dates

June 15 - abstracts will be posted to the website

July 31 - draft papers will be available to workshop participants

September 30 - final papers will be submitted for copy editing

December 31 - final manuscript will be submitted to Springer for publication

Q1 2012 - published workshop proceedings will be available


Records and Information for Financial Analysis and Risk Management: Governance, Analytics and Life Cycle Management

Principal Investigator: Victoria L Lemieux

Research Assistant(s): Sean Roncin

Funded by: SSHRC