Accounting Systems Technology for the 21st Century


1 Accounting Systems Technology for the 21st Century
Presented by Liv A. Watson

2 Today’s Agenda Accounting systems technologies and future trends
There will be two 10-minute breaks and a 1-hour lunch. XBRL. Questions and comments are always welcome.

3 Scalable Architecture Accounting as a System
Functionality Professional Analyst Casual Viewer Internet Browser Field Unit

4 The Traditional Accounting Information Systems Architecture

5 PCs and File Servers: A New Paradigm
The phrase “terminal/host” is used to describe computing architectures in which all data processing is performed in applications on a central host computer (almost always a mainframe or minicomputer), and the users interact with the hosted applications via dumb terminals.

Terminals:
- can be thought of as a monitor that acts as a window to the processing being done by the host computer
- usually have some type of keypunch mechanism, through which the user can enter data and simple requests for the host computer
- do not have any computing abilities or storage capacity, because they do not have processors or memory.

Terminals resemble drive-through speakers at a fast food restaurant. You can order food, but you don’t make it and don’t control how it’s made. The speakers will also not remember that you do not like pickles, no matter how many times you say it.

Disadvantages of terminal/host systems:
1. Terminals do not support graphical application interfaces.
2. Host applications are generally not very user-friendly.
3. Terminal/host systems are about as proprietary as you can get.
4. Terminal/host vendors are not very innovative.
5. Terminal/host systems are not very scalable.
6. Terminal/host systems are more sensitive to performance problems and failures than most other architectures.
7. Terminal/host systems require specialized knowledge to operate.

PCs and File Servers: A New Paradigm
PCs let users process data without having to schedule time on a central host computer. The PC combined the processing power of the host computer with an interface that shamed the dumb terminal… and put them both in a package that could fit on a desktop! The PC also took advantage of a new phenomenon: shrink-wrapped software.

Shrink-wrapped software:
- Was cheap
- Was commonly available
- Could run on computers from different vendors
- Was easier to create, use, and maintain

By the mid-1980s, PC users had figured out a few different ways to link their machines together over small areas, forming local area networks (LANs). Local area networks allowed PC users to:
- share files on connected machines without manually exchanging disks or tape drives
- share peripherals, such as printers and scanners.

Accounting data could be stored in files on a central, dedicated computer called a file server. The file server was able to download, or serve, the accounting files to the individual PCs, which would then actually process the data. The data had to be downloaded to the individual PCs each time they wanted to process the data, creating a fairly intensive transaction overhead.

PC/File Server Systems Disadvantages:
1. They are not scalable.
2. File server software is not very robust.
3. File server software is not very sophisticated.
4. File servers frequently utilize locking mechanisms at the file level.
5. File servers do not always have versioning capabilities.

Client-Server Architecture
Client/server is a network architecture in which each node (device) on the network is either a client or a server. Servers are powerful computers or processes (a fancy word for a program) that are dedicated to serving clients. Clients are devices (usually PCs) that are used to run applications and/or request that the servers run applications.

True client/server has the following characteristics:
1. Service-based.
2. Modular.
3. Resource-sharing.
4. Asymmetrical.
5. Transparent.
6. Open.
7. Scalable.
8. High-integrity.

6 Client/server architectures can partition data and distribute the computing workload between clients and servers.

7 Multi-tier Computing: Three Common Client Configurations
If the client is used only to view an application that is hosted remotely on a server, it is known as a presentation client. Presentation clients resemble the dumb terminals in terminal/host systems in that no files can be stored locally. The application itself is not stored on the client, but it’s viewed through some type of abstract container, such as a primitive browser.

Thin clients are just a bit more advanced than presentation clients are. They offer the presentation layer, but they also allow files to be stored locally on disk drives for data entry validation. They may also provide help files and messaging capability. The thin client, like the presentation client, relies on a remote host to run all of the actual applications.

Thick clients (or, less politically correct, fat clients) have all of the capabilities of thin clients, plus they are able to actually run and manage applications locally. They have a great deal more memory than thin clients, and they may even have database engines. They are therefore more expensive than presentation and thin clients. Unlike presentation and thin clients, any upgrades or changes made to data on a remote host must be manually downloaded to the thick client. Thick clients are therefore harder to maintain than presentation and thin clients.

Thick clients are best suited to situations in which the user needs fast, easy access to the data with a high level of processing speed. They are also beneficial when you will be frequently moving between applications during one computing session, or when you need to run multiple applications simultaneously. Otherwise, you should try to utilize a presentation or thin client whenever possible.

8 “2-tier,” “3-tier,” and “n-tier”
“A tier” describes how the computing workload is distributed in a client/server system. When you separate the application logic from the presentation and data storage layers, you create a third layer: the application logic layer. So 2-tier becomes 3-tier. “n-tier” is used to describe those architectures in which there is more than one application logic layer.

You may also hear the terms “2-tier,” “3-tier,” and “n-tier” used to describe how the computing workload is distributed in a client/server system. These are really just more descriptive ways of explaining the client/server workload distribution. In 2-tier client/server architectures, the application logic (the rule-based processing part of a complete application) is located either on the client with the presentation layer or on the server with the actual data storage (or both). An example of a 2-tier architecture would be a database server in which the data is stored inside the database along with the rules for how that data is to be manipulated. When you separate the application logic from the presentation and data storage layers, you create a third layer: the application logic layer. So 2-tier becomes 3-tier. The term “n-tier” is used to describe those architectures in which there is more than one application logic layer – the “n” being a variable that represents any number of application logic layers.

Benefits of using 3-tier client/server architectures instead of 2-tier (a minimal sketch of the layer separation follows this list):
1. You are not tied to one database. Unlike 2-tier, 3-tier client/server architectures allow you to utilize data from multiple databases within one business transaction.
2. You have more communication choices. 2-tier architectures allow only synchronous, connection-oriented communications. 3-tier architectures, on the other hand, enable you to utilize a broad range of communication types, such as queued delivery, publish and subscribe, broadcast, etc.
3. Your systems will be more robust. The separation of application logic and data allows you to restart or alter one aspect of the system without affecting the other. The separation also allows for more efficient troubleshooting. Both benefits result in your system having higher availability, meaning that the system can be used when needed.
4. You can integrate legacy (or heritage) systems. The separation of application logic and data usually makes it possible for you to access pre-existing data storage and applications from within your client/server system, even when the legacy systems are based on outdated or proprietary standards. 3-tier systems let you access these legacy sources using special translators and conversion utilities called gateways.
5. You have more hardware choices. Two-tier architectures usually only allow you to draw your data from one server, whereas 3-tier architectures let you draw your data from as many servers as you want to (within reason; your system’s memory and processing capacity will impose certain limitations).
6. Better Internet support. The freedom you have in choosing communication methods and hardware support means that 3-tier systems offer better Internet support than 2-tier.
7. Easier implementation of load balancing. Load balancing is a means of distributing the computing workload across a network so that no single device is overwhelmed. Load balancing is especially important for systems in which the number of clients is unpredictable. If one server starts to become overworked, the computing load will be transferred to a server that can handle the work.
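To make the tier separation concrete, here is a minimal, hypothetical Python sketch (not part of the original presentation): the data tier, the application-logic tier, and the presentation tier live in separate functions, so any one layer could be replaced or restarted without touching the others. All table, function, and value names are illustrative assumptions.

```python
# A minimal sketch of 3-tier separation; names and data are invented.
import sqlite3


# --- Data tier: owns storage, nothing else ----------------------------------
def open_data_tier() -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO invoices (amount) VALUES (?)",
                     [(120.0,), (75.5,), (310.25,)])
    return conn


# --- Application-logic tier: business rules, no SQL in the client -----------
def total_outstanding(conn: sqlite3.Connection) -> float:
    (total,) = conn.execute("SELECT SUM(amount) FROM invoices").fetchone()
    return round(total, 2)


# --- Presentation tier: formatting only, no business rules ------------------
def render(total: float) -> str:
    return f"Outstanding invoices: ${total:,.2f}"


if __name__ == "__main__":
    db = open_data_tier()
    print(render(total_outstanding(db)))   # Outstanding invoices: $505.75
```

In a real 3-tier deployment the three pieces would typically run on different machines, with the logic tier exposed over a network protocol rather than called in-process.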

9 What is the Government's role in the development and operation of the Net?
Internet – Originally designed as a Defense Department resource, it was expanded to educational institutions and finally to include businesses and individuals; the Internet is quickly becoming a significant communication tool in our society, from e-commerce to research to messaging and whatever else you can imagine. The Internet makes resources available that it would be impossible for any company or individual to assemble. There are many issues that have arisen with the growth of the Net. A few of these include:
- What is the Government's role in the development and operation of the Net?
- Should (and if so, how) states be able to tax activity on the Net passing through their jurisdiction?
- Can we effectively eliminate or limit access to offensive material, especially by young people?
- How can we improve the security of data traveling on the Net?
As the Net continues to develop, more and increasingly complex issues are sure to emerge.

Intranets use much of the same technology that fuels the Internet and the World Wide Web. Intranets, however, are for the internal use of a company and its employees. An intranet provides access to all data on the network from any place, in the next office or across the world. In some instances the Internet may be used as a tool to implement the intranet – serving as a transport mechanism. The distinguishing feature, when compared to the World Wide Web, is that users outside the company will not see or have access to this information. Limited only by imagination, intranets represent a powerful tool for companies to facilitate communication between different users.

Extranets – Extranets build on the benefits of an intranet by extending information to selected outsiders. These might be suppliers, customers, financing sources or anyone else having a legitimate need for the available data. This can be useful in improving communications between trading partners, making data available on a “real-time,” as-needed basis. Other characteristics and challenges of an extranet include:
- Multiplatform interoperability – the ability to operate with different industry standards as well as across multiple platforms.
- Scalability – the ability to adjust resources to the demands of the users. As more users are permitted to access the network, additional resources will be required to prevent degradation of service or system collapse.
- Reliability – to be effective it is important that the network be readily available when users need access. This requires properly maintained equipment, continuous monitoring with problem notification, error logging and system reporting functions.

10 Enterprise Resource Planning (ERP) Systems
MRP (Material Requirement Planning). MRP-II (Manufacturing Resource Planning). ERP: Human Resources, Finance, Customer Service, Engineering. The Institute of Management Accountants (IMA) provides a course, “Benchmarking ERP Systems.”

Manufacturers are all familiar with MRP (Material Requirement Planning) or MRP-II (Manufacturing Resource Planning). Enterprise Resource Planning (ERP) incorporates other areas such as Human Resources, Finance, Customer Service, Engineering, etc. into the process. ERP addresses system requirements, processing architecture, relational database management systems, bandwidth requirements, etc. As businesses continue to re-engineer to meet the competitive nature of the marketplace – including downsizing – this type of tool will become increasingly important to assist in making good management decisions. The Institute of Management Accountants (IMA) provides a course, “Benchmarking ERP Systems”, that includes topics on forecasting, planning, budgeting and managing the enterprise. The program also deals with topics such as financial reporting, analyzing products and customers for profitability, automation of policies, procedures and controls, among others.

11 Applying Information Technology to the Accounting Cycle

12 The operating system (O/S) is the most important program that runs on a computer:
Recognizing input from the keyboard. Sending output to the display screen. Keeping track of files and directories on the disk. Controlling peripheral devices such as disk drives and printers.

The operating system (O/S) is the most important program that runs on a computer. Every general-purpose computer must have an operating system to run other programs. Operating systems perform basic tasks, such as:
- Recognizing input from the keyboard
- Sending output to the display screen
- Keeping track of files and directories on the disk
- Controlling peripheral devices such as disk drives and printers
Whether or not you are aware of it, using any software application requires that you invoke the operating system. Whether you are a user of software or a programmer, you will come to appreciate the fact that the operating system takes care of many chores automatically.

Operating systems provide a software platform on top of which other programs, called application programs, can run. The application programs must be written to run on top of a particular operating system. Your choice of operating system, therefore, determines to a great extent the applications you can run. For PCs, the most popular operating systems are DOS, OS/2, and Windows, but others are available, such as Linux.

Users normally interact with the operating system through a set of commands. Some command interpreters are text oriented, requiring commands to be typed in. Other command interpreters are graphically oriented and let the user communicate by pointing and clicking on an icon, an on-screen picture that represents a specific command. NOTE: Beginners generally find graphically oriented interpreters easier to use, but many skilled computer users prefer text-oriented command interpreters because they give the user more control.

13 Operating systems can be classified as follows:
Single-tasking: Generally supports only one process at a time.
Multi-user: Allows two or more users to run a program at the same time.
Multi-processing: Supports running a program on more than one CPU.
Multi-tasking: Allows more than one program to run concurrently.
Multi-threading: Allows different parts of a single program to run concurrently.
Real time: Responds to input instantly. General-purpose operating systems, such as DOS and UNIX, are not real-time.

Operating systems can be classified as follows: Single-tasking: Generally supports only one process at a time. Multi-user: Allows two or more users to run a program at the same time. Some operating systems permit hundreds or even thousands of concurrent users. Multi-processing: Supports running a program on more than one CPU. Multi-tasking: Allows more than one program to run concurrently. Multi-threading: Allows different parts of a single program to run concurrently. Real time: Responds to input instantly. General-purpose operating systems, such as DOS and UNIX, are not real-time.

14 Introduction and History
Bare machines -- vacuum tubes and plug boards. Designed by J. W. Mauchly and J. P. Eckert of the University of Pennsylvania in 1945. No operating system. Transistors and batch systems. Clear distinction between designers, builders, operators, programmers, and maintenance personnel. Multiprogramming. 1980 to present: personal computers and workstations, network operating systems.

Early/First Generation Systems
- Bare machines -- vacuum tubes and plug boards
- Exemplified by ENIAC (Electronic Numerical Integrator And Computer)
- Designed by J. W. Mauchly and J. P. Eckert of the University of Pennsylvania in 1945
- Commissioned by the Ballistics Research Lab (BRL) at the Aberdeen Proving Ground in Maryland
- Made up of 18,000 vacuum tubes and 1,500 relays
- No operating system
- Black box concept -- human operators
- No or little protection

Second Generation Systems
- Transistors and batch systems
- Clear distinction between designers, builders, operators, programmers, and maintenance personnel
- Interrupts / exceptions
- Minimal protection

Third Generation Systems
- Multiprogramming
- System 360 and S/370 family of computers
- Spooling (simultaneous peripheral operation on-line)
- Time sharing
- On-line storage for: system programs, user programs and data, program libraries
- Virtual memory
- Multiprocessor configurations

Fourth Generation and Beyond (1980 to present)
- Personal computers and workstations
- MS-DOS and Unix
- Massively parallel systems
- Computer networks (communication aspect) -- network operating systems
- Distributed computing -- distributed operating systems

15 Network Operating Systems (NOS)
Single User DOS Applications: local workstation only; not written to work on a network. Network-Aware Applications: will work on a network, but only for a single user. Multi-user Applications: applications specifically written for networks – e-mail, scheduling, groupware.

A network operating system is an operating system that includes special functions for connecting computers and devices into a local-area network (LAN). Some operating systems, such as UNIX and the Mac OS, have networking functions built in. The term network operating system, however, is generally reserved for software that enhances a basic operating system by adding networking features. For example, some popular NOSs for DOS and Windows systems include Novell NetWare, Artisoft's LANtastic, Microsoft LAN Manager, and Windows NT.

Network Operating Systems (NOSs) have gone far beyond their roots of file and print services. Other functions, such as communications, database, application, and management services, have become equally important in corporate environments. NOS functions are implemented two ways:
- As a standalone operating environment that may or may not allow the support of additional services, such as database or electronic messaging
- As an additional service layered upon a general-purpose operating system, such as Unix or OpenVMS
For the most part, standalone NOS environments provide the highest performance when the only requirements are file and print services. They often are less stable and robust when supporting other functions. When multifunction servers are required, layering NOS functions on a general-purpose operating system becomes the best choice.

16 Peer-to-peer is a type of network in which each workstation has equivalent capabilities and responsibilities. This differs from client/server architectures, in which some computers are dedicated to serving the others. Peer-to-peer networks are generally simpler and less expensive, but they usually do not offer the same performance under heavy loads.

17 Peer-to-Peer NOS Characteristics
Each machine sends, receives, and processes data files (Client and Server). Simplistic in design and maintenance. Used for smaller number of users (10 to 50). Used when users are in same area. Used when network growth is not an issue. Less expensive (no dedicated server). Slower and less secure than File Server

18 Security

19 Firewalls
A firewall is a security mechanism that is designed to prevent unauthorized access to or from a private network. Firewalls can be hardware, software, or a hybrid of both. Firewalls act as gatekeepers, sitting between the rest of the world (i.e., the Internet and other ‘foreign’ networks) and the information that resides on your private network. Like most gatekeepers, firewalls have the ability to check the traffic that comes in and out of their domain - by looking at the network equivalent of I.D. badges. The firewall owner can determine exactly how thorough the “I.D. check” should be, and configure the firewall accordingly.

20 Packet Filters
There are two main types of firewalls: packet-filters and proxy servers.
Packet-Filters – A packet-filter looks at the IP headers of incoming network packets (a packet is basically just a chunk of digital data) and decides whether to accept or reject the packets based on predefined rules established by the firewall owner. The ‘predefined rules’ of which we speak tell the packet-filter what types of communications are allowed to enter and exit a given network. If an incoming (destination) IP address is not in the list (or table) that was configured by the firewall owner, the packet will be rejected. If an outgoing packet is sent from an IP address (or source) that is not on the list, it will be rejected, and so on. Packet-filters normally reside on the router, which is the logical place for them since they are essentially just more selective, security-concerned routers. They are not able to use any information available to them beyond the information contained in the IP headers and their tables, so they are “dumb” gatekeepers. Packet-filters can be fairly effective, though less so than proxy servers. They are difficult to configure, however, and their complexity often works to create unintended security holes.
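As a rough illustration of the rule-table idea described above, the following Python sketch accepts or rejects a "packet" using only header-style fields. The rule set, addresses, and field names are invented for the example and are not from the slides.

```python
# A simplified packet-filter sketch: decisions are made from header fields
# alone (source address and destination port), never from packet contents.

RULES = [
    # (source prefix, destination port, action)
    ("10.0.", 80,   "accept"),   # internal hosts may reach the web server
    ("10.0.", 443,  "accept"),
    ("",      None, "reject"),   # default rule: reject everything else
]


def filter_packet(src_ip: str, dst_port: int) -> str:
    """Return 'accept' or 'reject' using only IP-header information."""
    for prefix, port, action in RULES:
        if src_ip.startswith(prefix) and (port is None or port == dst_port):
            return action
    return "reject"   # anything not covered by a rule is dropped


print(filter_packet("10.0.3.7", 80))      # accept
print(filter_packet("203.0.113.9", 80))   # reject
```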

21 Proxy Server
A proxy server is the “smarter” of the two basic types of firewalls. Proxy servers run a bunch of small, trusted applications, called proxies, which act as specialized relays. A proxy server might have:
- a proxy for FTP
- a proxy for e-mail
- a proxy for HTTP, etc.
The proxy for FTP would intercept all FTP messages that enter or leave the network, the HTTP proxy would intercept all HTTP messages that enter or leave a network… and so on. Once the proxy has intercepted a message, it will transfer that data to the internal network IF (“the big if”) that message has access ‘rights.’ Proxy servers are savvier than packet-filters, however, and they can make decisions on what to let in and out based on context, etc.

A packet-filter can be likened to a minimum-wage security guard: if your name is on the (IP) list, you’re in. A proxy server is more like a bouncer at a very exclusive club in which the party inside has asked not to be disturbed. The bouncer can decide to pass your message through or turn you away depending on:
- when you arrive
- what you look like
- whether or not it thinks the party you’re trying to communicate with would want to hear what you have to say!
To carry this (poor) analogy further, the inside of the club is like your internal network, and only the bouncer (proxy) is visible from the outside. In this way, the goings-on inside the club (your network data and true network addresses) are kept secret from the outside world.

Proxy servers are more effective than packet-filters because they have a richer security policy definition set available to them (because of their ability to examine context) and because they can hide true network addresses. That is not to say that packet-filters are worthless. In fact, packet-filters and proxy servers can be combined to provide an even greater degree of security.

22 CERTIFICATE AUTHORITIES
There are four principal types of certificates: Certification Authority, Server, Personal, Software Publisher. While there are differences in the functions of the types of certificates, they all rely on the matching of certain data from different sources to provide verification.

The personal certificate utilizes a key pair made of a public key and a private key. These keys are created by your browser software. The public key is forwarded to the certificate authority along with different types of information, depending upon the level of certificate you are requesting; the private key never leaves your computer. Once you have obtained and installed the certificate, users having your public key can send you encrypted data that can only be opened with the corresponding private key maintained on your computer. Public keys are made available to others on-line by querying a database on the net. A user wanting to send you sensitive information would use the public key to encrypt the message before sending it to your address. Thus if the message is intercepted, no information is available to the interceptor. While there are many certificate authorities, the most popular is VeriSign, the first company to offer certificates to the public.
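The public/private key-pair mechanics described above can be sketched with the third-party Python `cryptography` package (an assumption; the slides do not name any tool): the public half is shared so others can encrypt, and only the private half, kept locally, can decrypt.

```python
# A hedged sketch of public-key encryption; the message text is made up.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The browser (or any client) generates the key pair locally.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()        # this half is published

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# A correspondent uses the public key to encrypt a sensitive message...
ciphertext = public_key.encrypt(b"Q3 earnings draft", oaep)

# ...and only the holder of the private key can read it.
print(private_key.decrypt(ciphertext, oaep))  # b'Q3 earnings draft'
```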

23 VIRUSES, BUGS AND WORMS
A virus is a program that attaches itself to other files. Viruses can be funny, irritating or destructive. A bug is a flaw in a browser that can be used by a hacker to circumvent a browser’s security functions. A worm is similar to a virus in that it can be destructive or irritating; it makes copies of itself and sends them to other users.

VIRUSES, BUGS AND WORMS
A virus is a program that attaches itself to other files. At one time it was thought only executable programs could be the host of a virus. It has been shown that this is not true, as new viruses have surfaced as parts of other types of files (word processing documents, e-mail, etc.). Viruses can be funny, irritating or destructive. A bug is a flaw in a browser that can be used by a hacker to circumvent a browser’s security functions. This allows the hacker to get beyond firewalls and other protective mechanisms. Upon gaining access, the intruder can make changes to or destroy data, gain access to sensitive information or create other types of havoc. A worm is similar to a virus in that it can be destructive or irritating. A worm makes copies of itself and sends them to other users, where the process is repeated. It’s important to remember that all of these techniques will not assure that a virus or other ailment will not occur. While vendors are continuously updating their software to address new threats, hackers and other writers of problem code are developing new methods to circumvent their protection efforts. This makes it especially important to frequently update your scanning software – typically from the vendor’s web site. In addition, users need to be alert for warnings about a new danger.

24 Protection VIRUSES, BUGS AND WORMS
There are steps that a user can take to protect themselves from these dangers. They include:
- Know your source of files and messages
- Use virus monitoring software to scan incoming files, or set the software to continuously monitor activity on your computer or server
- Only open files after they have been scanned
- Maintain frequent backups so that you can recover from a crash or other problem

VIRUSES, BUGS AND WORMS
A virus is a program that attaches itself to other files. At one time it was thought only executable programs could be the host of a virus. It has been shown that this is not true, as new viruses have surfaced as parts of other types of files (word processing documents, e-mail, etc.). Viruses can be funny, irritating or destructive. A bug is a flaw in a browser that can be used by a hacker to circumvent a browser’s security functions. This allows the hacker to get beyond firewalls and other protective mechanisms. Upon gaining access, the intruder can make changes to or destroy data, gain access to sensitive information or create other types of havoc. A worm is similar to a virus in that it can be destructive or irritating. A worm makes copies of itself and sends them to other users, where the process is repeated. It’s important to remember that all of these techniques will not assure that a virus or other ailment will not occur. While vendors are continuously updating their software to address new threats, hackers and other writers of problem code are developing new methods to circumvent their protection efforts. This makes it especially important to frequently update your scanning software – typically from the vendor’s web site. In addition, users need to be alert for warnings about a new danger.

25 THIRD PARTY ASSURANCE SERVICES
WebTrust. Better Business Bureau. There are a number of providers of assurance concerning web pages that a consumer might visit. While each of these services has different procedures and requirements, they all share the common goal of improving the confidence of the consumer that transactions entered into are sufficiently protected. At this time, there is no clear front runner in this area. Services that might be investigated include those provided through the AICPA and the Better Business Bureau.

26

27 e-Insurance Policies
For example, if your company unwittingly spread a virus that wiped out customers’ databases.
ACE USA, AIG, Lloyd’s of London, Marsh, St. Paul, Zurich.
Denial of service interruption, loss of intellectual property, hackers.

28 Relational Database Management Systems

29 Relational Database Management Systems (RDBMS)
A type of database management system that stores data in the form of related tables. The data is integrated into a single conceptual model and a single location. The data is independent from the application programs. The data is shared. Today, most leading accounting software manages data almost exclusively on RDBMS technology.

The Relational Database Management System (RDBMS) is a type of database management system that stores data in the form of related tables. Relational databases are powerful because they require few assumptions about how data is related or how it will be extracted from the database. As a result, the same database can be viewed in many different ways. In short, an RDBMS database is where data is collected for the purpose of being analyzed. A relational database has three important characteristics:
1. The data is integrated into a single conceptual model and a single location
2. The data is independent from the application programs
3. The data is shared
Today, most leading accounting software manages data almost exclusively on RDBMS technology. Every RDBMS uses some sort of structured query language (SQL) to create, maintain, and query the database. Both RDBMS and SQL emerged from research at IBM labs during the 1970s and from the work of Dr. E. F. Codd. The commercial realization of Codd’s work is a database architecture that is both adaptable to change and easy to access, solving three important problems of traditional mainframe networks and hierarchical databases and of other positional file management systems.

30 Basic Features of RDBMS
Data Abstraction – Users deal with a conceptual representation of the data that includes little control over where the data is stored.
Basic Database Architecture – Software layers insulate the user from low-level details; meta-data (data catalog).
Support for Multiple Users – Sharing data, concurrent usage, transaction processing, multiple data views.
Support for Various Types of Users – Database Administrator (DBA), database design, end users, casual end users, sophisticated end users.
Multiple Ways of Interfacing with the System – Query language, programming language, forms and command codes, menu-driven.
DBMS Interfaces – Graphical User Interfaces (GUI), forms-based interfaces, natural language interfaces, function keys, DBA interfaces.
DBMS Components – Stored database, system catalog/data directories, stored data manager, DDL compiler, run-time database processor, query compiler.
Relational databases store data in a two-dimensional format:
- Two-dimensional: a table of data represented by rows and columns
- Multi-dimensional: On-Line Analytical Processing (OLAP) solutions represented by highly indexed or summarized designs

31 Data Marts
Data marts are workgroup or departmental warehouses, which are small in size, typically less than 10 GB.
Meta Data:
1. The technical data contains a description of the operational database and a description of the data warehouse
2. The business data

Departmentalized Data Information – Data Marts: Data marts are workgroup or departmental warehouses, which are small in size, typically less than 10 GB. The data mart information, which is departmentalized, is tailored to the needs of the specific departmental work group.

Managing Information about the Warehouse – Meta Data: Meta data is information (data about data) about the data warehouse and the data that is contained in the data warehouse. There are technically two forms of meta data:
1. The technical data
2. The business data
Technical data contains a description of the operational database and a description of the data warehouse. This data helps the data administrators maintain the data warehouse and know where the data is coming from. Business data helps the users find information in the data warehouse without knowing the underlying implementation of the database.

From a technical standpoint, RDBMSs can differ widely. The internal organization can affect how quickly and flexibly you can extract information. Requests for information from a database are made in the form of a query, which is a stylized question. For example, the query SELECT ALL WHERE NAME = "WATSON" AND AGE > 37 requests all records in which the NAME field is WATSON and the AGE field is greater than 37 (see the standard-SQL sketch below). The set of rules for constructing queries is known as a query language. Different DBMSs support different query languages, although there is the semi-standardized query language called SQL (structured query language). Sophisticated languages for managing database systems are called fourth-generation languages, or 4GLs for short. The information from a database can be presented in a variety of formats. Most RDBMSs include a report writer program that enables you to output data in the form of a report. Many RDBMSs also include a graphics component that enables you to output information in the form of graphs and charts.
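The stylized query quoted in the notes can be written in standard SQL. Here is a small sketch using Python's built-in sqlite3 module, with a hypothetical `people` table standing in for the real data.

```python
# The notes' "SELECT ALL WHERE NAME = 'WATSON' AND AGE > 37" in plain SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO people VALUES (?, ?)",
                 [("WATSON", 42), ("WATSON", 30), ("CODD", 55)])

rows = conn.execute(
    "SELECT * FROM people WHERE name = ? AND age > ?", ("WATSON", 37)
).fetchall()
print(rows)   # [('WATSON', 42)]
```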

32 Relational Accounting and Transaction Triggers
A trigger is a piece of code stored in the database. Row-level triggers can be executed BEFORE or AFTER each row is modified by the triggering insert, update, or delete operation. Statement-level triggers execute after the entire operation is performed.

Relational Accounting and Transaction Triggers
The RDBMS and SQL models offer specific features useful for managing accounting data with embedded business rules called triggers. A trigger is a piece of code stored in the database. Triggers can be defined as row-level triggers or statement-level triggers. Row-level triggers can be executed BEFORE or AFTER each row is modified by the triggering insert, update, or delete operation. Statement-level triggers execute after the entire operation is performed. Flexibility in trigger execution time is particularly useful for triggers that rely on referential integrity actions, such as cascaded updates or deletes, being carried out, or not, as they execute. If an error occurs while a trigger is executing, the operation that fired the trigger fails. INSERT, UPDATE, and DELETE are atomic operations. When they fail, all effects of the statement (including the effects of triggers and any procedures called by triggers) are undone.

33 Why would accountants benefit from triggers?
- Recording the name of the user who tried to change an account code
- Recording the date and time a transaction occurred (see the trigger sketch below)
- Ensuring that a transaction is in balance before it is posted to the ledger
- Alerting the accountant to a budget overrun by sending an e-mail
- Printing out an audit trail
- Warning a user of unposted transactions before a report is to be run
- Checking that codes added to one table exist in other related code tables
- Deleting data from an address table when its customer owner is deleted
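As a sketch of the audit-trail idea in this list, the following example defines a row-level trigger that records the date and time each time a ledger row is changed. SQLite syntax is used purely for illustration; a production RDBMS would have its own trigger dialect, and the table and column names are hypothetical.

```python
# A hypothetical audit-trail trigger, demonstrated with Python's sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ledger (id INTEGER PRIMARY KEY, account TEXT, amount REAL);
CREATE TABLE audit_trail (ledger_id INTEGER, changed_at TEXT);

-- Row-level trigger: fires AFTER each ledger row is updated and records
-- the date and time of the change.
CREATE TRIGGER log_ledger_change AFTER UPDATE ON ledger
BEGIN
    INSERT INTO audit_trail VALUES (NEW.id, datetime('now'));
END;
""")

conn.execute("INSERT INTO ledger VALUES (1, '1000-Cash', 500.0)")
conn.execute("UPDATE ledger SET amount = 450.0 WHERE id = 1")
print(conn.execute("SELECT * FROM audit_trail").fetchall())
# e.g. [(1, '2025-06-01 12:00:00')]
```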

34

35 Benefits of RDBMS Scalability Transaction integrity
Centralized business rules. Centralized data allows real-time accuracy in transaction processing, since the single data occurrence is always updated. Shared data eliminates the inconsistencies that arise when data is stored in several places and not updated in all locations. Shared data means that all application programs use the same data. More informed decision making, based on one corporate database. Improved cost efficiencies. Higher level of customer service. Enhanced asset/liability management.

Drawbacks of RDBMS
One of the major drawbacks of RDBMS is that in most cases it requires you to buy a separate license from your accounting software license to run the RDBMS database. Another cost factor to consider is that RDBMS systems require a full- or part-time database administrator (DBA) on staff, because extracting the best performance requires constant monitoring, maintenance, and tuning by professional staff. For best performance, RDBMSs also need high amounts of spare memory and disk capacity. In summary, RDBMS systems, relative to traditional accounting systems, are more expensive than alternative data management systems to operate. Despite the drawbacks mentioned, RDBMS systems have more advantages than disadvantages for businesses that expect rapid growth or demand better decision reports from their accounting systems. Some of the benefits are listed above.

36 Data Warehousing
Bill Inmon coined the term "data warehouse." His definition is: "A (data) warehouse is a subject-oriented, integrated, time-variant and non-volatile collection of data in support of management's decision making process."
The definition of a data warehouse varies depending on whom you talk to. However, a general consensus exists that the purpose of a data warehouse is to provide end users with easy access to large amounts of company data to assist with operational and strategic business issues.
- Subject-Oriented – Data that gives information about a particular subject instead of about a company’s on-going operations.
- Integrated – Data that is gathered into the data warehouse from a variety of sources and merged into a coherent whole.
- Time-variant – All data in the data warehouse is identifiable with a particular time period.
- Non-volatile – Data is stable in a warehouse. More data is added, but data is never removed. This enables management to gain a consistent picture of the business.
To a number of people, a data warehouse is any collection of summarized data from various sources, structured and optimized for query access using OLAP (on-line analytical processing) query tools. The vendors of OLAP tools originally propagated this view. To others, a data warehouse is virtually any database containing data from more than one source, collected for the purpose of providing management information. This definition is neither helpful nor visionary, since such databases have been a feature of decision support solutions since long before the coining of the term "data warehousing”.

37 Information is stored in files or tables.
The emerging problem is not how to retrieve data, but how to manage, utilize and optimize the mountains of data our increasingly efficient information systems are collecting. Within this explosion, the challenge of data warehousing has become evident.
Terminology: A file is made up of a number of records. A record is made up of a number of fields, each of which has a specific identified name.

The emerging problem is not how to retrieve data, but how to manage, utilize and optimize the mountains of data our increasingly efficient information systems are collecting. Within this explosion, the challenge of data warehousing has become evident. In a perfect world there would be one kind of database system that could handle information of any complexity and which allowed infinitely flexible access to it. In the real world what has happened is that a number of different types of systems have been implemented to deal with particular types of information warehousing.

As with most computer-based advances, there is new terminology to learn. Information is stored in files or tables. A file is made up of a number of records. A record is made up of a number of fields, each of which has a specific identified name. As an illustration of this terminology and the sort of information that could reasonably be stored in a database, consider the main body of a telephone directory. The body itself is analogous to a file; each line in it is a structured record made up of fields containing (from left to right) name, address and telephone number information.

Once a specific file has been defined, a database system enables information to be input, modified and retrieved on users’ demand. For example, given information in the `telephone numbers' file above, reasonable requests of a database system might be:
- Display all records where the Name field starts with `IMA'
- Display the Name, Address, Town and Tel No. of records where the Name starts with `IMA'
- Display the Name, Address and Tel No. of records where the Name starts with `IMA' AND where the Town is `Montvale'
Boolean queries (several conditions linked using AND, OR and NOT), as illustrated in the last request above, are almost universally supported by retrieval systems (a small SQL sketch of these requests follows). In order to speed up the execution of queries, indexes may be created, either automatically or on request. Once retrieved, entire records or specified parts of them can be output in a format specified by the enquirer.

Some database systems, particularly those concerned with multi-user support, differentiate between different classes of users. Most importantly they recognize database administrators and end users. The database administrator has overall control of a database and is responsible for its design, the implementation of data validation and security matters. A participant interacts with the database, retrieving information and possibly modifying the content of the database subject to the constraints imposed by the database administrator.

Enterprise storage is now more than a necessary IT component. It is a key architectural element for both data warehousing and transaction systems that allows rapid response to sudden growth and change. It creates an infrastructure that encourages the adoption of new technologies. Basically, it eases management headaches.
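The telephone-directory requests above can be expressed as Boolean SQL queries. This sketch uses Python's sqlite3 module with made-up directory rows purely for illustration.

```python
# The last request above as a Boolean query: Name starts with 'IMA' AND
# Town is 'Montvale'. Data is invented for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE directory (name TEXT, address TEXT, town TEXT, tel_no TEXT)")
conn.executemany("INSERT INTO directory VALUES (?, ?, ?, ?)", [
    ("IMA Accountant", "10 Ledger Ln", "Montvale", "555-0101"),
    ("IMA Auditor", "22 Trial Balance Rd", "Paramus", "555-0102"),
    ("J. Doe", "5 Main St", "Montvale", "555-0103"),
])

rows = conn.execute(
    "SELECT name, address, tel_no FROM directory "
    "WHERE name LIKE 'IMA%' AND town = ?", ("Montvale",)
).fetchall()
print(rows)   # [('IMA Accountant', '10 Ledger Ln', '555-0101')]
```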

38 What, Then, Is a Data Warehouse?
Data warehouses assemble the data from heterogeneous databases so that users query only a single point. Internet and World Wide Web technologies have had a major impact on data management; many vendors now have interfaced their data warehouses to the Web. Intelligent agent technology will play a major role in locating and integrating various data sources on the Web. A data warehouse brings together the essential data from the heterogeneous databases, so that users need to query only the warehouse.

A data warehouse for managing data has become a critical need in many enterprises. It is often a complicated process to access data from heterogeneous databases. Several processing modules need to cooperate with each other for processing a query in a heterogeneous environment.
- A data warehouse brings together the essential data from the heterogeneous databases, so that users need to query only the warehouse
- Data warehouses assemble the data from heterogeneous databases so that users query only a single point
- Internet and World Wide Web technologies have had a major impact on data management. Many vendors now have interfaced their data warehouses to the Web
- Intelligent agent technology will play a major role in locating and integrating various data sources on the Web
Data warehousing can be the key differentiator in many different industries.

39 Data warehouse applications include:
Sales and marketing analysis across all industries. Inventory turn and product tracking in manufacturing. Category management, vendor analysis, and marketing program effectiveness analysis in retail. Profitability analysis or risk assessment in banking. Claim analysis or fraud detection in insurance.

Data warehouse applications include:
- Sales and marketing analysis across all industries
- Inventory turn and product tracking in manufacturing
- Category management, vendor analysis, and marketing program effectiveness analysis in retail
- Profitability analysis or risk assessment in banking
- Claim analysis or fraud detection in insurance

Vendors and Products
The concept of the relational database is very important to accountants. Over the past decade, the RDBMS has become the technology of choice for managing data generated by online transaction processing. Relational database management systems are composed of objects or relations. They are managed by operations and governed by data integrity constraints. Oracle Corporation, for example, produces products and services to meet your relational database management system needs. The main product is the Oracle8 Server, which enables you to store and manage information by using SQL and the PL/SQL engine for procedural constructs. Other relational database management products that are effectively in use in support of current business operations in the Commonwealth are:
- Oracle
- Informix
- Microsoft SQL Server
- DB2

40 Business Intelligence Tools

41 What Are Business Intelligence Tools?
Business intelligence tools are a kind of software that gives users the ability to access and analyze information that resides in databases throughout an enterprise. They are IT systems designed specifically to meet the needs of knowledge workers.

Business intelligence tools are a kind of software that gives users the ability to access and analyze information that resides in databases throughout an enterprise. Knowledge is the most valuable commodity in the Information Age. Many companies are discovering that they have a great deal of valuable knowledge hidden in the data they already possess. This knowledge is hidden because there is too much data, the data is not organized and, sometimes, the data is contradictory. Business intelligence tools can transform this data into useful knowledge; they help you create data-based knowledge using business intelligence software tools and the process of data warehousing. The following three types of tools are referred to as business intelligence tools:
- Multi-Dimensional Analysis Software – also known as Multi Software or OLAP (On-Line Analytical Processing) – software that gives the user the opportunity to look at the data from a variety of different dimensions.
- Query Tools – software that allows the user to ask questions about patterns or details in the data.
- Data Mining Tools – software that automatically searches for significant patterns or correlations in the data.

42 BIT v. OP (Business Intelligence Tools vs. Operational Processing)
Business Intelligence Tools:
- Built to enable exploration, analysis, and presentation of information
- Relatively few inquiries, which are often wide in scope
- Designed to get data out
Operational Processing:
- Used to automate routine, predictable tasks
- Large volume of small transactions that are limited in scope
- Designed to get data in

43 The following are three types of Business Intelligence Tools
Multi-Dimensional Analysis Software – also known as Multi Software or OLAP (On-Line Analytical Processing) – software that gives the user the opportunity to look at the data from a variety of different dimensions.
Query Tools – software that allows the user to ask questions about patterns or details in the data.
Data Mining Tools – software that automatically searches for significant patterns or correlations in the data.

Business intelligence tools can transform this data into useful knowledge; they help you create data-based knowledge using business intelligence software tools and the process of data warehousing. The following three types of tools are referred to as business intelligence tools:
- Multi-Dimensional Analysis Software – also known as Multi Software or OLAP (On-Line Analytical Processing) – software that gives the user the opportunity to look at the data from a variety of different dimensions.
- Query Tools – software that allows the user to ask questions about patterns or details in the data.
- Data Mining Tools – software that automatically searches for significant patterns or correlations in the data.

44 Data Query and Reporting Tools
Multi-Dimensional Analysis Software – also known as OLAP (On-Line Analytical Processing) – is the process of analysis that involves organizing and summarizing data in a multiple number of dimensions.

Query and reporting software tools fill the functional gap between corporate data repositories and the OLAP software tools, which can analyze such data. They conveniently create requests for needed information, access that data, and then deliver it in a usable informative report – all controlled at the desktop with no programming skills required. Cognos is one of the few software vendors whose business intelligence tools allow querying and reporting through the same interface.

Multi-Dimensional Analysis Software – also known as OLAP (On-Line Analytical Processing). OLAP is a process of analysis that involves organizing and summarizing data in a multiple number of dimensions. In other words, OLAP is the use of computers to analyze an organization's data. "OLAP" is the most widely used term for multi-dimensional analysis software. The term "On-Line Analytical Processing" was developed to distinguish data warehousing activities from "On-Line Transaction Processing" – the use of computers to run the on-going operation of a business. In its broadest usage the term "OLAP" is used as a synonym of "data warehousing". In a more narrow usage, the term OLAP is used to refer to the tools used for multi-dimensional analysis. "Think of an OLAP data structure as a Rubik's Cube of data that users can twist and twirl in different ways to work through what-if and what-happened scenarios." – Lee, The Editor, Datamation (May 1995)
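A small sketch of the "many dimensions at once" idea, using the pandas library (an assumption; the slides do not prescribe a tool): the same sales records, invented for the example, are summarized simultaneously by region and by quarter, one face of the OLAP cube.

```python
# Summarizing one measure (amount) along two dimensions (region, quarter).
import pandas as pd

sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "East", "West"],
    "quarter": ["Q1",   "Q2",   "Q1",   "Q2",   "Q1",   "Q1"],
    "amount":  [100,    150,    90,     120,    60,     40],
})

cube = pd.pivot_table(sales, values="amount", index="region",
                      columns="quarter", aggfunc="sum")
print(cube)
# quarter   Q1   Q2
# region
# East     160  150
# West     130  120
```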

45 OLAP “Value Added Decision Support”
"Think of an OLAP data structure as a Rubik's Cube of data that users can twist and twirl in different ways to work through what-if and what-happened scenarios." – Lee, The Editor, Datamation (May 1995)

46 New Analytical Approach
One of the most prominent and pervasive alternative approaches to managing analytical data lies in the relationship between knowledge management (KM) and decision support. The ability of any user, anywhere, to ask any question of any database, at any time.

47 Data Mining Concept and Useful Terminology I
Data Mining is the process of finding hidden patterns and relationships in the data. A Data Mart is a database that has the same characteristics as a data warehouse, but is usually smaller and is focused on the data for one division or one workgroup within an enterprise.

Data Mining is the process of finding hidden patterns and relationships in the data. Analyzing data involves the recognition of significant patterns. Human analysts can see patterns in small data sets. Specialized data mining tools are able to find patterns in large amounts of data. These tools are also able to analyze significant relationships that are apparent only when several dimensions are viewed at the same time. Users can ask questions of the data using standard queries when they know what they're looking for. Queries can be written for questions like this: "Which of our out-of-town customers have given us the most business in the last year?" Data mining is needed when the user's questions are more vague or general in nature. Data mining questions would include: "What attributes characterize the customers that gave us the most business in the past year?"

Organizations are gathering and storing more and more data. Every year the amount of data in the world is approximately doubling. This data is of little benefit unless it can be turned into useful information and knowledge. The goal of business intelligence and data warehousing is changing data into information and knowledge. Information by itself is an inadequate basis for business decisions because the amount of information, like the amount of data, is overwhelming. Business intelligence tools are designed to find what is significant – what really adds to our useful knowledge – in the piles of data and information.

Data Mart (also known as: Local Data Warehouse) – A data mart is a database that has the same characteristics as a data warehouse, but is usually smaller and is focused on the data for one division or one workgroup within an enterprise.
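A rough, hypothetical contrast between the two kinds of questions described above, using pandas with made-up order data: the first is a targeted query with a known answer shape, the second is an exploratory, mining-style look at what characterizes the top customers.

```python
# Query vs. mining-style question on a tiny invented data set.
import pandas as pd

orders = pd.DataFrame({
    "customer": ["A", "A", "B", "C", "C", "C"],
    "city":     ["Out-of-town", "Out-of-town", "Local", "Local", "Local", "Local"],
    "revenue":  [500, 700, 200, 300, 400, 900],
})

# Standard query: which out-of-town customers gave us the most business?
by_cust = (orders[orders.city == "Out-of-town"]
           .groupby("customer").revenue.sum().sort_values(ascending=False))
print(by_cust)          # A    1200

# Mining-style question: what attributes characterize the top customers?
top = orders.groupby("customer").revenue.sum().nlargest(2).index
print(orders[orders.customer.isin(top)].groupby("city").size())
```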

48 Data Mining Concept and Useful Terminology II
Data Migration is the movement of data from one environment to another. This happens when data is brought from a legacy system into a data warehouse. Data Mining is the process of finding hidden patterns and relationships in the data. Data Migration Data Migration is the movement of data from one environment to another. This happens when data is brought from a legacy system into a data warehouse. Data Mining Data Mining is the process of finding hidden patterns and relationships in the data.

49 Drill Down and Drill Up
Drill down and drill up is the ability to move between levels of the hierarchy when viewing data with a multi. Multi-dimensional analysis tools (multis) organize the data in two primary ways: in multiple dimensions and in hierarchies. Drilling down and drilling up allow an analyst to move down and up the hierarchies to see how the information at the various levels is related. After looking at the sales totals for a department store, the analyst may want to drill down to see the individual sales for each employee in one of the departments. Then the analyst may choose to drill up to view how this store's total sales compare to other stores in the same region.

Hierarchy
Hierarchy is the organization of data into a logical tree structure. Most multi-dimensional database structures allow dimensions to be organized into hierarchies. It is the combination of a multi-dimensional view with a hierarchical view in business intelligence software that allows users to grasp large amounts of data. Moving between the levels of a hierarchy is called drilling up and drilling down.

50 Benefits of BI Supply chain management Fraud management
Risk management. Product management. Financial controls.

The quest for competitive edge is driving organizations to develop increasingly innovative strategies, to differentiate themselves from the competition and improve performance. Providing timely and reliable information from data already contained within your business is critical to this development process. It is no longer enough to rely on instinct or business intuition to achieve market leadership. The economy is becoming knowledge based – and those with the best information will win. Meeting global competition; boosting speed-to-market; gaining market share; pursuing closer customer relationships; improving margins; reducing costs. Business intelligence has a wide range of applications that can be used to increase efficiency and effectiveness of many of an organization's internal processes. For example:
- Supply chain management – Business intelligence provides full information about inventory levels and logistics across the whole supply chain process, end to end, giving management better control over its impact on cash flows, costs and customer satisfaction.
- Fraud management – By providing access to extremely high volumes of detailed data, a business intelligence solution enables the detection of fraudulent behavior by analyzing transaction patterns over time.
- Risk management – Business intelligence provides the means to better validate risks by enabling the analysis of historical data to create risk profiles of customers, against which new customers can be evaluated.
- Product management – Business intelligence can provide quick and accurate feedback on how successful product decisions have been.
- Financial controls – Business intelligence can be used to improve margins and reduce costs. With detailed information about all of the company's activities it is possible to identify which products, customers, and geographies are the most profitable.

51 Transaction Processing

52 Transaction Processing
Transaction processing means that master files are updated as soon as transactions are entered at terminals or received over communication lines. Batch Processing Versus Transaction Processing A transaction processing (TP) system, also called an online or real-time system, consists of computer hardware and software hosting a transaction processing application to perform routine transactions necessary to conduct business. Examples include: ·        systems that manage sales order entry ·        airline reservations ·        payroll ·        employee records ·        shipping Transaction processing means that master files are updated as soon as transactions are entered at terminals or received over communication lines. For example, if you save receipts in a shoe box and add them up at the end of the year for taxes, that is batch processing. However, if you buy something and immediately add the amount to a running total, that is transaction processing. Batch Processing Versus Transaction Processing Transaction Processing Systems process the detailed data necessary to update records about fundamental business operations. There are basically two principal ways to process data: 1. Batch Processing 2. Online Processing Batch Processing involves holding transactions and processing them all at once, in batches. Online Processing involves processing the transactions individually, often at the same time they occur. A transaction processing application leverages the features of a transaction processing system. Developing such an application is extremely complex and can be hard to scale. To streamline application development, software vendors began producing transaction-based software specifically designed to manage system-level services. Two such product categories are: 1.      Database servers 2.      TP monitors. Database Servers Most relational database management systems (RDBMS) provide transaction-processing features. Clients call stored procedures in a database to make transaction-protected requests. Database servers work well in a two-tier client/server implementation, as long as the request volume remains low.
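As a rough illustration of the difference, here is a minimal Python sketch of the shoe-box analogy above; the receipt amounts are made up, and the "master file" is simply a running total.

receipts = [19.99, 5.25, 42.00, 7.80]  # example transactions

# Batch processing: hold the transactions and total them all at once.
def batch_total(shoe_box):
    return sum(shoe_box)

# Online (transaction) processing: update the running total the moment
# each transaction occurs.
running_total = 0.0
def record_transaction(amount):
    global running_total
    running_total += amount   # the "master file" is updated immediately
    return running_total

for r in receipts:
    record_transaction(r)

# Both approaches reach the same answer; they differ in when the update happens.
print(round(batch_total(receipts), 2), round(running_total, 2))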

53

54 Characteristics of Transaction Processing Systems
Provide fast, efficient processing to handle large amounts of input and output Perform rigorous data editing to ensure that records are accurate and up to date Are audited to ensure that all input data, processing, procedures, and output are complete, accurate, and valid Involve a high potential for security-related problems Support the work processes of large numbers of people; loss of the system can cause a severe and negative impact on the organization

55 Transaction Process Monitors
TP Monitor makes sure that groups of updates take place together or not at all. This also supports the four TP requirements: Atomicity Consistency Isolation Durability Transaction Process Monitors A TP Monitor is a subsystem that groups together sets of related database updates and submits them together to a relational database. The result is that the database server does not need to do all of the work of managing the consistency/correctness of the database; the TP Monitor makes sure that groups of updates take place together or not at all. The advantages of this include increased system robustness as well as throughput. This also supports the four TP requirements: Atomicity All transactions are either performed completely - committed, or are not done at all; a partial transaction that is aborted must be rolled back. Consistency The effects of a transaction must preserve required system properties. For instance, if funds are transferred between accounts, a deposit and withdrawal must both be committed to the database, so that the accounting system does not fall out of balance. Isolation Intermediate stages must not be made visible to other transactions. Thus, in the case of a transfer of funds between accounts, both sides of the double-entry bookkeeping system must change together. This means that transactions appear to execute serially (e.g. in order) even if the work is done concurrently. Durability Once a transaction is committed, the change must persist, except in the face of a catastrophic failure.

56 1. Atomicity All transactions are either performed completely - committed, or are not done at all; a partial transaction that is aborted must be rolled back.

57 2. Consistency The effects of a transaction must preserve required system properties. For instance, if funds are transferred between accounts, a deposit and withdrawal must both be committed to the database, so that the accounting system does not fall out of balance.

58 3. Isolation Intermediate stages must not be made visible to other transactions. Thus, in the case of a transfer of funds between accounts, both sides of the double-entry bookkeeping system must change together. This means that transactions appear to execute serially (e.g. in order) even if the work is done concurrently.

59 4. Durability Once a transaction is committed, the change must persist, except in the face of a catastrophic failure.
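The four requirements can be seen in miniature in any transactional database. The sketch below uses Python's built-in sqlite3 module to run the funds-transfer example as a single atomic transaction; the account names and balances are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("checking", 1000.0), ("savings", 500.0)])
conn.commit()

def transfer(amount, src, dst):
    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
            # If either statement fails, neither change is kept (atomicity),
            # so the books never fall out of balance (consistency).
    except sqlite3.Error:
        pass  # the partial transaction has already been rolled back

transfer(200.0, "checking", "savings")
print(conn.execute("SELECT name, balance FROM accounts").fetchall())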

60 Elements of the Transaction Processing
Accounting cycle Ledgers Journals Trial balances Coding Reports Source documents

61 Transaction Processing Cycle
Transaction Processing Capabilities ·        Distributed Transaction Processing Analysis and Design ·        Operational Data Modeling ·        Operational Database Schema Design and Implementation ·        Transaction Services Analysis and Design ·        Provide fast, efficient processing to handle large amounts of input and output ·        Perform rigorous data editing to ensure that records are accurate and up to date ·        Are audited to ensure that all input data, processing, procedures, and output are complete, accurate, and valid Transaction Processing for Competitive Advantage (competitive advantage: example) ·        Customer loyalty increased: use of a customer interaction system to monitor and track each customer interaction with the company ·        Superior services provided to customers: use of tracking systems that are accessible by customers to determine shipment status ·        Superior information gathering performed: use of an order configuration system to ensure that products and services ordered will meet the customer's objectives ·        Costs dramatically reduced: use of a warehouse management system employing scanners and bar-coded products to reduce labor hours and improve inventory accuracy. OLTP (online transaction processing) OLTP (online transaction processing) is a class of program that facilitates and manages transaction-oriented applications, typically for data entry and retrieval transactions in a number of industries, including banking, airlines, mail-order, supermarkets, and manufacturing. Probably the most widely installed OLTP product is IBM's CICS (Customer Information Control System). Today's online transaction processing increasingly requires support for transactions that span a network and may include more than one company. For this reason, new OLTP software uses client-server processing and brokering software that allows transactions to run on different computer platforms in a network.

62 Document Image Processing

63 Image Processing is the automated technology of:
Image processing systems automate and streamline the flow of paper through an organization. Imaging is the automated technology of: Document Storage Document Management Document Retrieval Document Communication. The explosive growth of Document Image Processing (DIP) has had a huge impact on business in the 1990s. Its relevance to the reengineering of business processes and its impact on workflow, document management, and electronic mail make it a technology no one can afford to ignore. Image Processing systems are good for: ·        Scanning large amounts of paper-based documents into a computer system and indexing them for quick and easy retrieval. ·        Processing all types of documented information, whether graphical, full-text, or combinations of both. ·        Converting images into a digital format via the document scanner and then storing these images on a mass storage device, usually a "write-once" optical mechanism. The digital image is normally written to an optical disk, which can hold a large number of document images, depending upon its capacity. Systems can usually access many optical disks, hence there is no real limit to the storage capacity of a single system. The recommended media for use with DIP systems is WORM (Write Once Read Many). As you might expect, these disks enable you to store information but not overwrite it. The use of this type of media protects against accidental loss of information by overwriting. The inability to delete information can also be useful where you need to keep an audit trail of documents. If you want to save a document again after making changes, it will have to be saved alongside the original document rather than replacing it. This method of storage is preferred both for legal and tax reasons as well as for internal security. Some of the latest optical drives available will allow the use of both WORM and re-writable media.

64 Capturing Documents Documents are captured with the use of a digitizing scanner that looks and acts very much like a normal photocopier. The document is digitized using the same techniques employed in fax machines; therefore, whatever can be written or drawn on a piece of paper can be captured. There is a wide range of scanners available for use with DIP systems. Most DIP users will probably employ a mono flatbed scanner with a sheet feeder. Speeds can range from a few pages per minute for low throughput, to between 20 and 180 or more pages per minute for heavy use. Specialist scanners are also available for color work and A0-size documents. Most serious DIP systems will allow users to scan in batch mode before indexing takes place. Retrieving Image Documents When the documents are captured, they must be indexed on the system. Indexing involves entering data onto an index page within the database, with references unique to the specific documents, e.g. an insurance policy holder's name, postcode, policy number, policy type and issue date. Future retrievals could then be via any one or a combination of these fields. The types of searches that may be performed include search by form and full-text retrieval. The latter allows every word associated with the document (without quantity or length restrictions) to be used as a search key. Optical Character Recognition (OCR) and Imaging OCR is a method in which the computer 'reads' the words on an image and converts the text into a form that can then be processed by the computer. OCR can be used to create more relevant indexes by reading specific areas of the image to extract information such as the date, customer name, and delivery address. A variety of packages are available which run under Windows; some specialist packages can even be trained to recognize handwriting and signatures. Imaging versus Microfilm Imaging is easier and cheaper to use and does not require the chemical processes of microfilm. In addition, imaging can produce far better reproductions of the original documents. Most importantly, digital document images can be transported around an organization very quickly over a local or wide area network. Even the fax can be integrated into the system. Added to this is the enhanced security of limited access rights and date recording available with optical storage systems. The Legality of Digitized Images There is no definitive solution to this problem. Under civil law the best available evidence is usually applied. No legal precedent has yet been established for the acceptance of digitized images; however, legal opinion seems to be that they are at least as acceptable as microfilm, if not more so. It is believed that ultimately the law will be forced to recognize this new technology in the same way it has microfilm. Best Practices of Document Imaging Systems Imaging is best suited to any documents that have to be stored for more than, say, six months, that are referred to regularly by more than one person, and whose storage and handling require space and human resources. Typical organizations currently using imaging systems include: ·        Insurance Companies ·        Banks and Building Societies ·        Airlines ·        Government Agencies
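A minimal Python sketch of the index-and-retrieve step described above; the policy documents and the index fields (holder, postcode, policy number, and so on) are hypothetical.

documents = [
    {"doc_id": "IMG-0001", "holder": "J. Smith", "postcode": "SW1A 1AA",
     "policy_no": "P-1001", "policy_type": "Home", "issued": "1999-03-14"},
    {"doc_id": "IMG-0002", "holder": "A. Jones", "postcode": "M1 2AB",
     "policy_no": "P-1002", "policy_type": "Motor", "issued": "1999-04-02"},
]

def retrieve(index, **criteria):
    """Return the document images whose index entries match every criterion."""
    return [d for d in index
            if all(d.get(field) == value for field, value in criteria.items())]

# Retrieval can be via any one field or a combination of fields.
print(retrieve(documents, policy_type="Home"))
print(retrieve(documents, holder="A. Jones", policy_no="P-1002"))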

65 Image Processing systems are good for:
Scanning large amounts of paper-based documents into a computer system and indexing them for quick and easy retrieval. Processing all types of documented information, whether graphical, full-text, or combinations of both. Converting images into a digital format via the document scanner and then storing these images on a mass storage device, usually a "write-once" optical mechanism.

66 Document Image Processing Within the Accounting Office
Application Module: Type of Image
General Ledger: Cash transfer and deposit slips, wire transfer request
Accounts Receivable: Sales invoices, checks paid, expense reports
Accounts Payable: Invoices, canceled checks, expense time sheets
Purchasing: Purchase orders, price lists and brochures, agreements
Inventory: Pictures of items, copies of insurance or title contracts
Fixed assets: Pictures of assets, copies of insurance or title contracts
Human resources: Pictures of employees/applicants, resumes, and professional/educational certificates.

67 Benefits Saves time by retrieving documents quickly and efficiently at your terminal. Saves time by allowing documents to be shared electronically. Saves money by releasing valuable filing space. Saves money by using electronic forwarding instead of photocopying. Saves money by increasing staff productivity. Enhances customer service by timely and accurate retrieval of information. Be it customer contracts, invoices, proof of delivery forms, sales and purchase orders or general correspondence, your documents will be displayed or printed quickly and efficiently and even transferred via fax modem or electronic mail.

68 GroupWare and Workflow

69 GroupWare and Workflow Components and Concepts
Describe a rapidly evolving collection of software tools that have been developed to enable more efficient human collaboration. Groupware aids in the: Creation Sharing And tracking of unstructured information within and between organizations in support of collaborative activity A. Components and Concepts “Groupware” is a very nebulous term that is used to describe a rapidly evolving collection of software tools that have been developed to enable more efficient human collaboration. The key word here is “collaboration.” Groupware aids in the: ·        Creation, ·        Sharing, ·        And tracking of unstructured information within and between organizations in support of collaborative activity.

70 The Major Vendors are: Lotus Notes/Domino, Microsoft Exchange,
Novell Groupwise, and Netscape SuiteSpot/Collabra Groupware is sometimes even called “collaborative computing.” There are currently over 500 products that call themselves “groupware.” The major vendors are: Lotus Notes/Domino, Microsoft Exchange, Novell Groupwise, and Netscape SuiteSpot/Collabra

71 Groupware can be said to encompass at least six core technologies:
Multimedia electronic document management systems (EDMS) Electronic conferencing systems Electronic scheduling systems Electronic mail systems Telephony Workflow systems Groupware can be said to encompass at least six core technologies: multimedia electronic document management systems (EDMS) electronic conferencing systems electronic scheduling systems electronic mail systems telephony workflow systems Groupware gains its power from the synergy of these technologies – the tools in concert are much more useful and flexible than any one of them could be individually.

72 Pitfalls in implementing a groupware system
As the systems become functionally broader, they become more difficult to implement in an ordered, logical fashion Expectations of the system Training On-screen graphical flowchart of the module Double-clicking a step in the flowchart takes the user to the appropriate function in the application. Ability to Bolster or Hinder Reengineering Efforts Most groupware implementations are found to result in significant returns on investment. The research firm IDC conducted a study in 1994 that found groupware implementations with a median investment of $100,000 resulted in ROIs of between 16% and 1666%. More than half of the companies surveyed indicated returns in excess of 100%, and more than a quarter showed returns in excess of 200%. IDC was so flabbergasted by these figures that they repeated the study with a completely different group of users…and obtained almost identical results! Most groupware ROI studies base their investment cost figures on the total cost of implementation, which includes not only software and equipment, but also training and support costs, application development costs, etc. For every $1 you spend on actual groupware hardware and software, you should expect to spend an additional $____ to $____ on training, support, and application development. What types of companies typically benefit the most from groupware implementations? ________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________

73 Document The basic instrument of storage in a groupware system is the document. A document is the abstract “box” that holds all of the unstructured data you will use in the groupware system. A document is the equivalent of the table in SQL databases, except that it holds unstructured (or “semi-structured”) data instead of highly structured relational data.

74 Document Databases. Research notes Blueprints Financial statements
Vacation photos Videos Voicemails Bulletin boards Faxes E-mails. The basic instrument of storage in a groupware system is the document. A document is the abstract “box” that holds all of the unstructured data you will use in the groupware system. A document is the equivalent of the table in SQL databases, except that it holds unstructured (or “semi-structured”) data instead of highly structured relational data. The ability to handle unstructured data adeptly is one of the things that set groupware apart from SQL databases. Every type of text and image imaginable can be put into a document without any structure whatsoever. Try doing that with a regular RDBMS. A document could conceivably contain: ·        research notes ·        blueprints ·        financial statements ·        vacation photos ·        videos ·        voicemails ·        bulletin boards ·        faxes ·        e-mails Documents, like tables, are just building blocks. They are used to create document databases. Document databases can then be shared among any users on the groupware system. Groupware systems give you everything you need to manage documents, including search, query, and navigation tools. You can index and retrieve documents in a document database by the document’s properties (you could run a query for all the documents that contain video footage, for example) or by actual document content (the photo of you dancing on the boss’s desk at the 1996 office Christmas party.) Multimedia Electronic Document Management Systems have their origins in basic document imaging solutions, which are covered in another chapter of this course.
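A minimal Python sketch of the document-database idea: each document is a loosely structured "box" whose properties can be queried. The items and property names are invented for illustration.

document_db = [
    {"title": "Research notes",  "type": "text",  "body": "Findings from the Q3 study ..."},
    {"title": "Christmas party", "type": "photo", "body": b"\x89PNG..."},
    {"title": "Product demo",    "type": "video", "body": b"..."},
]

def query(db, **properties):
    """Retrieve documents by property, e.g. all documents that contain video."""
    return [doc for doc in db
            if all(doc.get(k) == v for k, v in properties.items())]

print([d["title"] for d in query(document_db, type="video")])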

75 Electronic Conferencing
Asynchronous Versus Real-time Text-based Teleconferencing Electronic whiteboards Data conferencing Electronic Conferencing Electronic conferencing can be divided into two basic camps: asynchronous and real-time. The ability to handle asynchronous communication is one of the underlying threads that ties together all of the components of groupware. It is also one of the technologies that imbue groupware with its flexibility. Using asynchronous conferencing, teams can participate in discussions wherever and whenever they want. This technology offers a means of storing and organizing the collective knowledge of a group in a meaningful manner. While many assume that real-time communication is preferable to asynchronous communication, this is not necessarily the case. 1.       Asynchronous communication allows one more time to reflect and organize his/her thoughts before making a contribution, which can result in a higher quality conversation. 2.       Asynchronous communication also provides time for participants to collect any information they need to contribute from other sources. 3.       ‘Shy’ participants are also more likely to contribute in asynchronous conferences, because much of the trauma they feel in trying to speak around more vocal participants is dispelled. 4.       Everyone has the opportunity to express himself (or herself) in asynchronous conferencing, something that cannot be said of real-time conferencing. The information collected in asynchronous conferences is searchable, and easily organized around specific topics of interest. This technology is usually text-based, although teleconferencing, and electronic whiteboards offer these (and other) benefits at a substantially higher cost. Data conferencing is a phrase used to describe electronic conferences in which digitized audio/video messages are transmitted along with numerical data (such as stock quotes, etc.)

76 Internet Relay Chat (IRC) and ‘Usenet’
The tools at the disposal of the viewer/participant will usually include: A mechanism to create a new topic or a new thread A mechanism for responding to an existing thread A user/topic/date search utility. Internet Relay Chat (IRC) and ‘Usenet’ are examples of text-based conferencing, the former being near real-time and the latter being asynchronous. The text-based variants normally take advantage of threaded discussions, in which postings to the conference are grouped according to topic by thread. A thread is a collapsible structure in which one topic is posted as a header and subsequent postings pertaining to that header (replies) are indented below it. Postings that develop around sub-topics within an ongoing thread can be nested within the overall structure using further indentation. The tools at the disposal of the viewer/participant will usually include: ·        a mechanism to create a new topic or a new thread ·        a mechanism for responding to an existing thread ·        and a user/topic/date search utility. The user/topic/date search utility allows a participant to quickly sort message threads by the author of the message, the topic of the message, or the date the message was created. Some text-based systems will notify you if certain conditions are met within an ongoing conference, so that you do not have to keep checking back with the conference. For example, if you post a question to a conference and are only interested in obtaining an answer, you can set the system up so that it will automatically e-mail you when someone answers your question. If your asynchronous, text-based conferencing system is not integrated with e-mail, this utility will not work, and you will have to periodically “check in” to monitor the status of the conference.

77 Real-Time Conferences
Allow team members to collaborate on a project using near-instantaneously refreshed document replicas, electronic whiteboards, and audio/video communication tools such as teleconferencing systems Real-time conferences allow team members to collaborate on a project using near-instantaneously refreshed document replicas, electronic whiteboards, and audio/video communication tools such as teleconferencing systems. Generally, a conference participant is designated as a moderator and he/she controls the flow of messages so that people do not talk over each other. The moderator may also control who has access to the shared document in the whiteboard application, as well. The electronic whiteboard acts as a working space for brainstorming and other types of drawn visual communications. It is drawn on using a stylus or other input device and the annotation is replicated in real-time to all other conference participants. Whiteboard applications are normally incorporated as part of a videoconferencing application, but can be used as standalone applications. Real-time whiteboard applications are sometimes called document conferencing tools.

78 Electronic Scheduling
Electronic scheduling is a means of sharing information about meetings, deadlines, and “To Do” lists between personnel and project members. A project member wanting to schedule a meeting would simply: View the calendars of the desired participants Choose the best time, date, and location for the meeting Write the meeting information directly to the calendars of meeting participants Send reminder notifications via e-mail. Electronic scheduling is one of the simplest technologies utilized in groupware, but one that holds an enormous potential for increasing organizational efficiency. Electronic scheduling is a means of sharing information about meetings, deadlines, and “To Do” lists between personnel and project members. A project member wanting to schedule a meeting would simply: 1.      view the calendars of the desired participants 2.      choose the best time, date, and location for the meeting 3.      write the meeting information directly to the calendars of meeting participants 4.      send reminder notifications via e-mail. The better scheduling programs will automatically tell you the optimal time for meetings once you designate the participants, saving you the trouble of trying to match schedules. Such programs will also generate the reminder notifications automatically. Electronic scheduling programs can also automatically re-route tasks to other qualified workers if a project member is sick, on vacation, or otherwise unavailable. This function alone can save projects from running behind schedule and over budget.
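A minimal Python sketch of the "find the optimal meeting time" feature; the participants, their booked hours, and the working day are all hypothetical.

busy = {
    "alice": {9, 10, 14},    # hours of the day already booked
    "bob":   {9, 11, 15},
    "carol": {10, 11, 14},
}
working_hours = range(9, 17)

def best_meeting_times(calendars, hours):
    """Return the hours at which every desired participant is free."""
    return [h for h in hours
            if all(h not in booked for booked in calendars.values())]

print(best_meeting_times(busy, working_hours))   # here: [12, 13, 16]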

79 E-Mail Features Address Books Automation Document Attachments
Group Broadcasting Carbon Copies Security Notification Professional firms typically follow two rules of thumb when arriving at an e-mail archive policy: 1.      Never write anything in an e-mail that you wouldn’t want everyone to read in the newspaper the next day. 2.      Always archive all e-mails and make it impossible for end-users to permanently delete them. (The archived e-mails should be structured so that a text-retrieval program can easily search them.) FEATURES: Address Books: You should be able to organize the names, numbers, and addresses of your correspondents alphabetically in a searchable address book. Your system should also allow you to organize separate address books by subject (“Personal” and “Business,” for example.) Automation: Many e-mail packages offer some capacity for automating certain administrative tasks related to messaging, either through intelligent agents, custom scripting languages, or “wizards.” An automated in-box might incorporate a “spambuster” that searches incoming e-mails and automatically deletes mail that was unsolicited or is from unrecognized senders, for example. The deletion, filing, and archiving of e-mails can also be automated. Document Attachments: Your system should allow you to append the most common document types to your message (such as Microsoft Word documents, PDF files, etc.) You should check to make sure that your e-mail system can support standards for attachments such as the Multipurpose Internet Mail Extension (MIME) protocol. Group Broadcasting: Your system should allow you to easily organize lists of people so that you can send a message to a large number of people simultaneously without having to re-key each address, etc. This feature allows for “one to many” transmissions, whereas a typical e-mail is a “one to one” or “one to two” transmission. Carbon Copies: This feature is similar to group broadcasting, but is used when an e-mail is being sent to just a couple of recipients. [A blind carbon copy (“bcc”) allows you to send a message to two or three recipients simultaneously without revealing to the regular recipients that the “bcc” recipient is also being sent the message. That is, the “bcc” recipient is a confidential recipient. This feature is often used to document how matters of a sensitive nature, such as disciplinary actions, were handled.] Security: Some systems allow you to encrypt your high-security messages so that you can be reasonably confident that only the intended recipient(s) will be able to read them. Such systems usually utilize a digital signature or other type of mechanism that requires that the recipient have the correct “key” to open the legible message. Notification: This feature allows the person who sends an e-mail to know when (or if) the message has been received and opened by the intended recipient. This feature allows you to more effectively deal with situations in which your intended recipient does not check their e-mail or does not take the desired action.
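The Automation feature can be pictured with a minimal Python sketch of a "spambuster"-style in-box rule; the address book entries and messages are made up for illustration.

address_book = {"colleague@firm.example", "client@corp.example"}

inbox = [
    {"sender": "client@corp.example",   "subject": "Year-end figures"},
    {"sender": "promo@unknown.example", "subject": "You have won!!!"},
]

def apply_rules(messages, known_senders):
    """Keep mail from recognized senders; everything else is filed as junk."""
    kept, junk = [], []
    for msg in messages:
        (kept if msg["sender"] in known_senders else junk).append(msg)
    return kept, junk

kept, junk = apply_rules(inbox, address_book)
print(len(kept), "kept;", len(junk), "junked")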

80 Ad Hoc and Process-Oriented Workflows
Ad hoc (or unstructured) workflows are those that give workers maximum freedom in completing their work. Process-oriented workflows (or structured workflows), on the other hand, are used to automate processes that are long-lived, repetitive, and well defined Workflow is often characterized as an assembly line, with the projects or objects to be worked upon moving from one location to another, with value being added at each stop. Workflow is composed of the rules, routes, and roles that define how the various operations are to be completed, when and where the work is to be done, and who is to do it. ·        The rules determine what information is passed on to a given person and at what time. ·        The route defines how the data is moved, or its path. ·        The role defines specific job functions - independently of the actual people who perform the tasks. (You don’t want a task to be waiting in line for “Bob” to do it when you really need a “customer service representative” to do it.) Workflow systems also incorporate some type of contingency plan for any exceptions that may occur, and a mechanism for tracking “work in progress.” Some also provide a unified interface that places all the tools needed to complete a given “item” in the same work environment as the item. The “item” being worked upon can be broken apart into pieces, transformed, re-routed into other workflows, and/or merged back into the original item at any ‘rendezvous’ point. A good workflow solution is like a good news reporter: it is always concerned with the “who, what, when, where, and why” of everything. The dynamic nature of workflow makes it very flexible. Any process that can be diagrammed can be at least partially automated using a workflow solution. Workflow applications are normally designed using visual metaphors that are connected by lines that diagram the actual object paths. You can create sequential routes, parallel routes, circular routes, feedback loops (useful for quality control), etc. Workflow solutions are normally categorized according to the level of structure involved in the process: Ad hoc (or unstructured) workflows are those that give workers maximum freedom in completing their work. Ad hoc workflow solutions provide minimal prompting to the worker, and allow the human to do anything that the workflow software can’t (or hasn’t been programmed to) do. Ad hoc workflows are best used in situations where worker actions are highly iterative and automation needs to be incremental in nature, where human intuition or “fuzzy judgement” is needed, and when projects or sub-processes have relatively short lifecycles. The co-authoring of a book is a process that might benefit from an ad hoc workflow. Process-oriented workflows (or structured workflows), on the other hand, are used to automate processes that are long-lived, repetitive, and well defined. Any process that adheres to relatively rigid business rules and policies can benefit from a process-oriented workflow. The classical example of a process-oriented workflow is the loan approval process. In approving a loan, a lender follows a fairly rigid formula to determine creditworthiness. This could easily be automated using a process-oriented workflow solution. Similarly, insurance claims processing and tax preparation would benefit from the implementation of an automated, process-oriented workflow solution.
Some would argue that workflow is not necessarily a collaborative activity and should therefore be thought of as a distinct technology, that it should stand apart from groupware. Workflow is increasingly being subsumed under the groupware banner, however, and we believe rightfully so. While it is technically true that workflow may not always involve collaboration, workflow solutions in the modern enterprise are usually collaborative in nature, and much is to be gained by viewing the technologies holistically. There are subtle differences between the two: groupware tends to be more ad hoc in nature than workflow, while workflow tends to be more process-oriented. But since most activities fall somewhere on the spectrum between ad hoc and process-oriented – and activities usually overlap anyway – we don’t believe anything is to be gained from a strict separation of workflow and groupware.
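The rules, routes, and roles described above can be sketched in a few lines of Python. The loan-approval steps, threshold, and role names below are hypothetical and are not taken from any workflow product.

route = ["data_entry", "credit_check", "approval"]           # the path the item follows
roles = {"data_entry": "clerk",
         "credit_check": "credit_analyst",
         "approval": "loan_officer"}                          # job functions, not people

def next_step(item, current):
    """Rule: small applications with strong credit skip straight to approval."""
    if current == "data_entry" and item["amount"] < 5000 and item["score"] > 700:
        return "approval"
    i = route.index(current)
    return route[i + 1] if i + 1 < len(route) else None

application = {"amount": 2500, "score": 720}
step = "data_entry"
while step is not None:
    print(step, "- handled by role:", roles[step])
    step = next_step(application, step)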

81 Benefits for Future Unified messaging systems
Computer-telephony integration (CTI) Interactive voice response (IVR.) This is a relatively new area of technology convergence, so we will probably see many new and innovative applications within the next few years as new CTI/IVR applications make their way onto the groupware playing field. Benefits for Future Groupware is increasingly expanding into the area of telephony. The universal in-box of many groupware systems will now allow you to organize your voicemails along with your e-mail, faxes, and other documents. Such applications are called unified messaging systems. You can answer and make telephone calls and faxes from within such a system. Functionality that was previously the domain of call centers and PBXs can now be easily incorporated into groupware software also. Voice messages can now be routed, screened and re-routed along with other types of media in workflow applications, for example. Text-to-voice and voice-to-text functions allow you to have your e-mail read aloud to you or allow you to dictate messages, respectively. All of these innovations have arisen out of work done in the areas of computer-telephony integration (CTI) and interactive voice response (IVR.) This is a relatively new area of technology convergence, so we will probably see many new and innovative applications within the next few years as new CTI/IVR applications make their way onto the groupware playing field.

82 Distributed Computing

83 Introduction to Distributed Computing Concepts
The trend toward integrated decision support for the extended enterprise. Use of databases based on meaning rather than just structure. A focus on asset management, cellular design, self-managing applications, and collaborative commerce, to take us beyond ERP systems.

84 Distributed Computing Concepts
A. Introduction to Distributed Computing Concepts In its infancy, computing was done on large centralized mainframes with terminals for user access. With the introduction of smaller computers, the landscape has changed to networking computers to allow for the sharing of resources. This sharing of resources can take the form of client/server – where one computer (the server) waits for another computer (the client) to contact it. An alternative to client/server is a peer-to-peer network in which each computer can act as a server or host. This distinction is becoming somewhat blurred as the server becomes more of a “traffic cop”, directing requests for resources to the peer that can service the need. Such a structure promotes efficiency by allowing load balancing, improves collaboration among workers, improves reliability through replication and, among other things, reduces costs by sharing resources. If we add object technology, we take one more step toward distributed computing. An object is composed of internal data and internal processes that receive data from an external interface, process the data, and return it to the external source using the common interface. By utilizing these concepts, a large problem can be attacked in small pieces that are easier to solve and program. When object technology is utilized with proper interfaces, the same code (program) can be shared by many processes or incorporated into larger programs, thus requiring fewer computer resources and allowing faster development of new processes. Strengths and Weaknesses of Object Request Brokers (ORB) An object request broker is a structure that allows the integration of objects into multiple programs and processes. Much like the sharing of resources in a peer-to-peer network, an ORB provides the means of sharing processes or programs across multiple applications and different computer types and platforms. A single set of code (an object) is written with an interface to a Common Object Request Broker. These objects can then be “called” by other programs. This is accomplished by having the requesting program (client) provide the ORB with the data required by the object for proper processing. The ORB transfers the request, with the necessary data, to the object. Once received by the object, processing is performed and the results are passed back to the ORB for forwarding to the requesting program. Use of an ORB allows the client to utilize the object without giving consideration as to where the object is located or the programming language used within the object.
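A minimal Python sketch of the broker idea: the client hands the broker a request plus the required data, and the broker locates the object and returns the result. The object name, the invoke interface, and the tax calculation are hypothetical; this is not a real CORBA API.

class TaxObject:
    def invoke(self, data):
        return round(data["amount"] * data["rate"], 2)

class Broker:
    def __init__(self):
        self._registry = {}            # object name -> object implementation

    def register(self, name, obj):
        self._registry[name] = obj

    def request(self, name, data):
        # The client neither knows nor cares where the object lives or what
        # language it is written in; it only supplies the name and the data.
        return self._registry[name].invoke(data)

orb = Broker()
orb.register("sales_tax", TaxObject())
print(orb.request("sales_tax", {"amount": 100.0, "rate": 0.07}))   # 7.0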

85 The Internet and the World Wide Web

86 X. The Internet and the World Wide Web
A. Historical Overview Origin of the Internet - ·        U.S. Department of Defense, 1969 ·        Increased recognition and growth in the 1990s ·        Designed for use by the military and researchers to communicate ·        Provided the capability to re-route traffic in the event of the failure of any “station” on the path to delivery ·        Global in nature

87 Hypertext Markup Language (HTML)
HTML is the established standard for publishing hypertext on the WWW. The current recommendation of this non-proprietary product is version 4. This language is the basis for describing the presentation on a browser. While this is an established standard, major browsers handle certain features differently, thus requiring Web authors to develop separate pages for the applicable browser. Efforts are currently underway to develop a new standard version that will incorporate additional features.

88 Hypertext Transfer Protocol (HTTP)
Hypertext Transfer Protocol or HTTP is a set of rules for exchanging files on the WWW. This represents the application protocol for other protocols (principally TCP/IP) used for the exchange of information on the Internet. The Web uses addresses (URLs) of the form The World Wide Web utilizes “Web pages”, which are text, graphics, buttons & hypertext links. In order to facilitate communications between different types of machines, the WWW uses the Hypertext Transfer Protocol (HTTP). Hypertext Transfer Protocol (HTTP) Hypertext Transfer Protocol or HTTP is a set of rules for exchanging files on the WWW. This represents the application protocol for other protocols (principally TCP/IP) used for the exchange of information on the Internet.
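A minimal Python sketch of an HTTP exchange using the standard library; the URL is a placeholder example, and running the sketch requires network access.

from urllib.request import urlopen

url = "http://www.example.com/"           # an address of the form http://host/path
with urlopen(url) as response:            # a browser-style GET request
    print(response.status)                # e.g. 200 if the page was served
    print(response.headers["Content-Type"])
    body = response.read()                # the HTML a browser would render
print(len(body), "bytes received")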

89 Common Gateway Interface (CGI)
A Common Gateway Interface (CGI) is the tool used to allow data not available on the WWW to be interactively provided to the user by having the web server relay commands to other programs. Gateways can be written in many programming languages. Some examples of CGI programs include: ·        image maps ·        search engines ·        guest books ·        password entry ·        Web site visit counters ·        advertising The process utilized in working with a CGI is: 1.    The user comes to the CGI input screen (form) by clicking on a hyperlink or through direct addressing in the user’s browser. 2.    The receiving Web server executes the commands in the CGI. The completion of this script returns the results (output) and terminates the process. 3.    The results of the request, a dynamically generated web page (HTML) are transmitted to the user and displayed by the browser.
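A minimal sketch of step 2 in Python: a CGI program that the Web server would execute, which reads the user's input and writes a dynamically generated HTML page back to the browser. The form field name "visitor" is hypothetical.

#!/usr/bin/env python3
import os
from urllib.parse import parse_qs

query = parse_qs(os.environ.get("QUERY_STRING", ""))   # data from the input form
name = query.get("visitor", ["guest"])[0]

print("Content-Type: text/html")    # header the server relays to the browser
print()                             # a blank line separates headers from the body
print("<html><body><h1>Hello, " + name + "!</h1></body></html>")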

90 Servlets Servlets Servlets are Java programs that execute inside a Web server running a Java Virtual Machine. Servlets are rapidly becoming a preferred solution for what had previously been the turf of CGI. Advantages of servlets include: 1.      Offers greater security by using all of the security features of Java. 2.      Faulty code will write an error message to the log and will not crash the server. 3.      Utilizes less server resources by starting a new thread, not a new process, when a request is made for a new servlet and by utilizing the same thread for successive requests. 4.      Are modular in nature and can direct the output from one servlet to another. This provides the ability to reuse components and shortens the development time.

91 Advanced Hypertext Markup Specifications
The language has three important functions, to provide direction as to: What markup codes are allowed What markup codes are required How the codes will be recognized as not being part of the basic text Standard Generalized Markup Language SGML is a language that gained status as an International Standard (ISO 8879) in 1986. To be more precise, it is a guide that formally describes a language and its uses. This system is the means of communicating how data is to be interpreted. A system, like SGML, provides the structure of how a document should be encoded (marked up) to allow other programs to interpret raw text or other data. For example, certain codes indicate page headings, paragraph bodies, and so forth. The language has three important functions, to provide direction as to: 1. What markup codes are allowed 2. What markup codes are required 3. How the codes will be recognized as not being part of the basic text. One significant advantage of such a system is that it can be applied across different operating platforms. This allows data to be shared by various users in a multitude of applications and environments.

92 Extensible Markup Language (XML)
Extensible Markup Language, or XML, is a refinement or more restrictive form of SGML. While being more restrictive, it offers some advantages over HTML (also a subset of SGML), and SGML itself. Advantages of XML: 1.      XML permits the use of a Document Type Definition (DTD). This eliminates the need for a developer to choose between different flavors of HTML, which have been developed by competing software companies. Because of the many variations, HTML is really not a standard any longer. 2.      XML provides not only presentational information, but also tells you what the information is. 3.      XML is easier to work with than SGML and has more functionality than HTML. 4.      XML allows you to add codes (which HTML does not.) This gives the developer more flexibility. Adoption of uniform codes by a profession or industry would result in the ability to easily transfer data across programs or computer platforms. 5.      XML can incorporate format codes and can be viewed and edited using any text editor.
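One way to see how XML "tells you what the information is": the tags in the fragment below name the data, so a program can total an invoice without caring how it is displayed. A minimal Python sketch; the invoice vocabulary is invented for illustration, not an official taxonomy.

import xml.etree.ElementTree as ET

invoice_xml = """
<invoice number="INV-1001">
  <customer>ABC Manufacturing</customer>
  <line item="Widget" quantity="3" unitPrice="19.95"/>
  <line item="Bracket" quantity="10" unitPrice="2.50"/>
</invoice>
"""

root = ET.fromstring(invoice_xml)
total = sum(float(l.get("quantity")) * float(l.get("unitPrice"))
            for l in root.findall("line"))
print(root.get("number"), root.findtext("customer"), round(total, 2))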

93 Cascading Style Sheets (CSS)
Cascading style sheets are groups of style sheets (groups of properties) that allow web authors to define and control how HTML elements are viewed in a Web browser. HTML permits only limited control in formatting documents, and users have developed various non-standard methods to address this shortcoming. CSS establishes a set of standard properties that the author can use for Web page formatting and layout. This process also allows the reuse of particular formatting throughout the document (or in other documents, if an external style sheet is used) without entering all the parameters. There are many style sheet commands, including: ·        Font and Text Definitions ·        Positioning and Division Definitions ·        Margin and Background Commands

94 E-Commerce and EDI The term E-Commerce (Electronic Commerce) brings different things to mind depending upon your perspective or the context in which it is used. EDI (Electronic Data Interchange) is a subset of E-Commerce. EDI forms the basis for information to flow between two organizations without paper (or with less of it) using a predefined set of parameters. The term E-Commerce (Electronic Commerce) brings different things to mind depending upon your perspective or the context in which it is used. An overview definition might be - any activity between two entities that uses data communications methodologies to accomplish their objectives. This can encompass a wide range of activities including: ·        Use of the telephone ·        Use of fax machines ·        Use of financial institutions ·        Purchase of airplane tickets ·        E-mail communications ·        Voice messaging services ·        And …………………………….. As you can see, the list of items that could be (or are) included under the umbrella of E-Commerce is extremely broad. For the most part, however, it is not unusual for people to view EDI (discussed below) and E-Commerce as one and the same. Electronic Data Interchange EDI (Electronic Data Interchange) is a subset of E-Commerce. EDI forms the basis for information to flow between two organizations without paper (or with less of it) using a predefined set of parameters. How does EDI attempt to accomplish this? By the use of agreed upon standards for different types of “documents” (pieces of information), the necessary data can flow from one entity to another – regardless of the compatibility of the systems used by the trading partners. Examples of this will be explored in further depth when we discuss types of e-commerce activities. For EDI to be successful as an effective E-commerce tool, standards had to be developed. Prior to the establishment of the current standards, companies developed their own protocols or parameters for transferring data with their trading partners. These proprietary efforts were primarily fueled by large corporations and “shoved down” to their suppliers. As time passed, the variations became so numerous that it was impossible for companies to satisfy their need to work with multiple trading partners. As a result of the necessity to have agreement about non-proprietary guidelines, various groups formed to accomplish this task. The current standard is identified as ANSI X12, published by the American National Standards Institute. The process for establishing new standards (or making changes) includes development by an ANSI committee, a period of public exposure and review, followed by consensus of ANSI members. For companies involved in international trade, another set of standards, EDIFACT (ISO 9735), created by the United Nations Electronic Data Interchange for Administration, Commerce and Transport committee, becomes important. While efforts are underway to merge these two standards, there is some resistance by North American companies due to their investment in the ANSI X12 standards.

95 E-Commerce and EDI Functions
Virtual Stores and the Online Gold Rush Customer Relationship Management Self Service Accounting and Human Resource Applications Customers could check their accounts receivable status. Vendors could check their accounts payable status. Outside sales people could enter and monitor customer orders. Employees could review their employee benefit accounts – and in some cases make changes in their preferences or elections. Employees could schedule vacation or other time away from the work-site. 1. Virtual Stores and the Online Gold Rush Currently becoming popular are “virtual stores”, a marketplace without bricks and mortar. While some of these stores are completely new (AMAZON.com), others are being established by traditional merchants to supplement (and maybe replace) their normal means of attracting and conducting sales of their products. Used in conjunction with other e-commerce applications, no inventory and little overhead are required to complete a timely delivery. Imagine: no checkout lines, no fighting traffic and the ability to shop 24 hours a day. Because of the growing interest, an “online gold rush” has developed as merchants scramble not to be left behind. 2. Customer Relationship Management (Order of Golf on the NET) The maintenance of information about your customers can be a difficult and time-consuming task. However, this information is invaluable for a company to prosper. Some of the pieces needed include: ·        Who is the contact person for the customer? ·        What are the buying habits / needs of the customer? ·        Does the customer require any special documents? ·        How frequently should a company representative contact or visit the customer? At what level? ·        How important is this customer to your business? ·        What are the credit requirements / restrictions? While this is only a subset of the information you might desire, it’s fairly clear that for a company with more than just a few customers, some electronic process is required to accumulate and retain this data. In addition, what if different sales people from your company maintained this information in varying formats – or not at all? What would happen if that person were no longer available? Fortunately, existing software allows you to address these problems on a consistent and effective basis. In addition, these same products allow you to look at data in many different ways (size, industry, geography, etc.). In addition, if properly designed and integrated, they may allow you to utilize information from your other data systems to reduce the entry of data. In some instances, trading partners integrate their processes and use EDI and other technology to eliminate the need for human intervention to assure that the product is available as needed. This creates efficiency and lower costs for both parties. In some instances, vendors are permitted to monitor customer inventory and sales levels or production schedules and ship additional stock where and when it is needed. In other circumstances, the customer is permitted to query the supplier’s stock for availability of a needed product. Or they may be able to query the supplier’s shipping records to see when they should expect arrival of an order. While there are probably many other facets of customer relationship management that require attention, the ones described above (and probably some not mentioned) can be performed by, or assisted through, the use of technology resources.
It's important to remember, however, that proper implementation and monitoring are critical for success. In addition, trust and appropriate safeguards are crucial to an effective relationship.

96 Online Banking and Tax Payment Systems
Some of the types of transactions that might be included are: Transfer of funds between deposit accounts. Transfer of funds between deposit and loan accounts (and vice versa). Initiate electronic payment to vendors. Query their accounts for activity and to aid in reconciliation. Send messages to customer service representatives and receive replies. Online Banking and Tax Payment Systems Many, if not all, banks of any size have programs available that permit transactions to be completed on-line either through the Internet or via direct dialup. These programs allow businesses (and individuals) to obtain information about their accounts (checking, money market, loans, lines of credit, etc.) and initiate transactions. Some of the types of transactions that might be included are: ·        Transfer of funds between deposit accounts. ·        Transfer of funds between deposit and loan accounts (and vice versa). ·        Initiate electronic payment to vendors. ·        Query their accounts for activity and to aid in reconciliation. ·        Send messages to customer service representatives and receive replies. List any other applications that could be used in the banking industry:

97 Creating a Technology Plan

98 Creating a Technology Plan
Get top management committed to the project Do research, make a plan, and create a budget and timeline Needs assessment Contact vendors and gather your resources Set up the prototype and choose vendor product Establish policies and best practices Educate the employees on what is coming Roll out the hardware and software Install accounting applications Establish maintenance system, audit test Evaluate and continuous improvement XI.      Creating a Technology Plan A.                     Introduction to Methodologies for More Informed Business Decisions Your job is to introduce your company to a new accounting application. On the surface, this is easy to do. You call a few vendors, set up a few promos and load the software onto the server. In some cases it is that easy, but in most organizations implementing new accounting software is a very difficult task. Before starting the project there are 10 steps that are worth considering. They are: ·        Step 1: Get top management committed to the project ·        Step 2: Do research, make a plan, and create a budget and timeline ·        Step 3: Contact vendors and gather your resources ·        Step 4: Set up the prototype and choose vendor product ·        Step 5: Establish policies and best practices ·        Step 6: Educate the employees on what is coming ·        Step 7: Roll out the hardware and software ·        Step 8: Install accounting applications ·        Step 9: Establish maintenance system, audit test ·        Step 10: Evaluate and continuous improvement LOOK IN THE BOOK

99 Projected Goals and Objectives
What problem will the new accounting system solve – Where is the pain? Top management needs to understand and endorse the project. How will you achieve that? How technically adept are the participants? Project Manager – who is that person?

100 Benefits vs. Costs Is top management aware of the hard and soft benefits and costs of the project? What method will you use to determine the ROI of your project? Will the pilot project be able to demonstrate ROI clearly?

101 Budget for Each Implementation Phase
Break the project into phases of 60 days to 6 months Construct a budget for each phase of the project

102 Database and Legacy System Integration
Identify all of your company’s legacy systems and databases. Have legacy security issues been addressed? Are the candidate legacy system interfaces clearly defined?

103 Performance Issues? Have hardware and software platforms been defined?
Have you planned for different growth scenarios?

104 Server Locations, Hosting, and Maintenance:
Will your company have a dedicated server to host the application, or will you host the content on a shared server? Who will maintain the accounting application?

105 Implementation Time and Milestones.
Have reporting requirements been fully defined by management? Are there meetings scheduled to review the findings with top management?

106 Staffing/Resources to Maintain and Support the Project
Is there a clear policy in place to resolve problems? Are there clear maintenance procedures in place? Are there clear guidelines for implementing upgrades? Have backup procedures been developed? Does an emergency plan exist for system failure? How will ongoing training be managed?

107 The Future of Accounting, Information, Technology, and Business Solutions

108 Technology Projections
Computer systems will be on-line and virtually connected. Intelligence will be distributed to handle screen layouts, data entry validation, and other processing steps. Computer sites will harbor intelligent agents. An intelligent agent is software that waits in the background and performs an action if a specific event occurs. Monitoring systems will focus on exception reporting and will place heavy emphasis on fund transfer systems. This is often called "audit by exception," as opposed to auditing actual reporting. Viruses and hackers will continue to be a growing concern. XII. The Future of Accounting, Information Technology, and Business Solutions Preparing for Change Many forward-thinking members of the management accounting profession are all too aware that the world in which we live is rapidly changing in profound ways. As a result, the accounting profession is in a stage of serious introspection, evaluating all facets of what it does and struggling to create a vision of what it can and should be. One of the major challenges today is for accountants to use IT effectively to build an accounting information system architecture that improves the ability of accounting to support the organization's goals. To meet these challenges, management accounting professionals need to develop a strategy, a conceptual understanding of IT resources, and the ability to understand new business models, activities, and processes. With the birth of e-business, the trend is also for management accountants to work in more real-time business environments. The challenges facing management accountants call for change and bring new opportunities. Thus, management accountants must truly understand the business world and strive to add value to the organization. This will require the profession to be associated with fewer clerical and bookkeeping tasks. Technology Projections: Computer systems will be on-line and virtually connected. Intelligence will be distributed to handle screen layouts, data entry validation, and other processing steps. Computer sites will harbor intelligent agents: software that waits in the background and performs an action if a specific event occurs. Monitoring systems will focus on exception reporting ("audit by exception," as opposed to auditing actual reporting) and will place heavy emphasis on fund transfer systems. Viruses and hackers will continue to be a growing concern.
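A minimal sketch of the "intelligent agent" / audit-by-exception idea described above: the agent watches a stream of transactions in the background and acts only when a rule fires. The threshold, transaction records, and alert action are invented for the example.

    from typing import Callable, Iterable

    def exception_agent(transactions: Iterable[dict],
                        rule: Callable[[dict], bool],
                        action: Callable[[dict], None]) -> None:
        # Audit by exception: only transactions matching the rule are reported.
        for txn in transactions:
            if rule(txn):
                action(txn)

    transactions = [
        {"id": 1, "type": "fund_transfer", "amount": 2_500},
        {"id": 2, "type": "fund_transfer", "amount": 75_000},
        {"id": 3, "type": "vendor_payment", "amount": 900},
    ]

    exception_agent(
        transactions,
        rule=lambda t: t["type"] == "fund_transfer" and t["amount"] > 10_000,
        action=lambda t: print(f"ALERT: review transaction {t['id']} for {t['amount']}"),
    )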

109 Future Benefits of Relational Accounting Systems
Parallel Processing is an architecture within a single computer that performs more than one operation at the same time. The advent of parallel processing has led to dramatic performance improvements in accounting systems. Relational Accounting refers to the use of relational database management systems (RDBMS), which have become the technology of choice for managing the data generated by Online Transaction Processing (OLTP). In a relational database accounting environment, relationships are created by comparing data, such as account numbers and names. Future Benefits of Relational Accounting Systems Today most accounting systems also support some level of parallel processing, distributed processing, and replication functionality. As these technologies mature, more accounting applications will likely take advantage of their potential to add functional value.
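A minimal sketch of the relational idea, using Python's built-in sqlite3 module (the table and column names are invented for the example): records in separate tables are related simply by comparing shared values such as the account number, and can then be summarized with a join.

    import sqlite3

    # In-memory relational database: relationships come from matching values
    # (here the account number) rather than from physical pointers.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE accounts (account_no TEXT PRIMARY KEY, name TEXT);
        CREATE TABLE journal  (id INTEGER PRIMARY KEY, account_no TEXT, amount REAL);
        INSERT INTO accounts VALUES ('1000', 'Cash'), ('4000', 'Sales');
        INSERT INTO journal (account_no, amount)
            VALUES ('1000', 250.0), ('4000', -250.0), ('1000', 80.0);
    """)

    # Join the two tables on account_no and aggregate per account.
    for row in con.execute("""
            SELECT a.account_no, a.name, SUM(j.amount)
            FROM accounts a JOIN journal j ON j.account_no = a.account_no
            GROUP BY a.account_no, a.name"""):
        print(row)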

110 Distributed Processing
Is the distribution of processing across multiple computers throughout an organization. This structure lets a single transaction span multiple databases by ensuring that the process completes in either all of the databases or none of them.
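The all-or-nothing behavior described above is what a two-phase-commit protocol provides in practice. The sketch below only imitates the idea with two in-memory SQLite databases; the simplified try/commit/rollback coordinator is illustrative, not a production distributed-transaction protocol.

    import sqlite3

    # Simplified all-or-nothing update across two databases: both commit,
    # or both roll back. Real systems use a two-phase-commit coordinator.
    db_a = sqlite3.connect(":memory:")
    db_b = sqlite3.connect(":memory:")
    for db in (db_a, db_b):
        db.execute("CREATE TABLE ledger (account TEXT, amount REAL)")

    def distributed_post(entries_a, entries_b):
        try:
            db_a.executemany("INSERT INTO ledger VALUES (?, ?)", entries_a)
            db_b.executemany("INSERT INTO ledger VALUES (?, ?)", entries_b)
            db_a.commit()
            db_b.commit()          # only after both databases accepted the work
        except Exception:
            db_a.rollback()
            db_b.rollback()        # neither database keeps a partial transaction
            raise

    distributed_post([("1000", 100.0)], [("4000", -100.0)])
    print(db_a.execute("SELECT * FROM ledger").fetchall())
    print(db_b.execute("SELECT * FROM ledger").fetchall())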

111 Replication Functionality
The ability to keep distributed databases synchronized by routinely copying the entire database, or a subset of it, to other servers in the network. Replication Functionality can be used to publish reports and other accounting data to subscribing servers for dissemination to users via e-mail or the Internet.
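A minimal sketch of snapshot-style replication using the sqlite3 backup API from Python's standard library. This is only a stand-in for the publisher/subscriber replication feature of a full client/server DBMS; the table contents are invented.

    import sqlite3

    # Snapshot replication: routinely copy the publishing database to a
    # subscribing server. sqlite3's backup() stands in for a real
    # replication feature.
    publisher = sqlite3.connect(":memory:")
    subscriber = sqlite3.connect(":memory:")

    publisher.execute("CREATE TABLE balances (account TEXT, amount REAL)")
    publisher.execute("INSERT INTO balances VALUES ('Cash', 3500.0)")
    publisher.commit()

    publisher.backup(subscriber)   # copy the whole database to the subscriber
    print(subscriber.execute("SELECT * FROM balances").fetchall())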

112 Object-Oriented DBMS Provides more flexibility than systems designed for relational databases. While relational databases easily provide one-to-many and many-to-one relationships, object-oriented databases allow for many-to-many relationships. The ultimate goal of object accounting systems is that it should not matter which source language they were programmed in or which computer in the network they are running on.
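A minimal sketch of the many-to-many idea using plain Python objects (the class names are invented; a real object DBMS would persist objects like these directly): an invoice can reference several cost centers and a cost center can appear on many invoices, with no join table.

    # Many-to-many relationship modeled directly as object references.
    class CostCenter:
        def __init__(self, name):
            self.name = name
            self.invoices = []

    class Invoice:
        def __init__(self, number):
            self.number = number
            self.cost_centers = []

        def allocate(self, cost_center):
            # Link both sides of the relationship.
            self.cost_centers.append(cost_center)
            cost_center.invoices.append(self)

    marketing, it_dept = CostCenter("Marketing"), CostCenter("IT")
    inv = Invoice("INV-1001")
    inv.allocate(marketing)
    inv.allocate(it_dept)
    print([cc.name for cc in inv.cost_centers])     # ['Marketing', 'IT']
    print([i.number for i in marketing.invoices])   # ['INV-1001']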

113 Other Trends Graphical Accounting
Refers to the use of a Graphical User Interface (GUI) to present accounting system functions and data to users. Today practically every major accounting system vendor has released its accounting software with the same standard GUI interfaces. Graphical Accounting refers to the use of a Graphical User Interface (GUI) to present accounting system functions and data to users. Today practically every major accounting system vendor has released its accounting software with the same standard GUI interfaces, making it easier for users to compare packages and spot good, bad, and innovative interface designs. However, we are increasingly moving toward a world in which transactions will reach accounting systems through electronic commerce, so manual transaction entry will soon become a dying art. Image Accounting links digital images to the accounting software. For example, linking an image of an inventory item helps users correctly identify the item and absorb information about it. As more and more transactions are initiated electronically through electronic data interchange (EDI), electronic funds transfer (EFT), or the Internet, the need to scan, store, and manipulate paper-based documents disappears. Additional hardware and software is required: scanner software; an indexing station for tagging images; image manipulation software; and image servers. E-mail Accounting refers to the integration of e-mail features into the accounting software. An embedded e-mail feature lets users: compose e-mail messages to others from within the accounting application; access the e-mail address book for routing messages to individuals or groups; and send attachments, such as spreadsheets, via e-mail. Spreadsheet Accounting and Adaptable Accounting are covered on the following slides.

114 Other Trends Spread Sheet Accounting
The integration of spreadsheet software with your accounting system. Most client/server accounting packages include direct transfer of data to and from spreadsheets without the need for messy data file exports and imports.
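A minimal sketch of moving accounting data into a spreadsheet-readable file with Python's built-in csv module. This is only a stand-in for the direct spreadsheet links the slide describes; the trial-balance rows and file name are invented.

    import csv

    # Write trial-balance rows to a CSV file that Excel or any spreadsheet
    # program can open directly.
    trial_balance = [
        {"account": "1000 Cash",  "debit": 3500.0, "credit": 0.0},
        {"account": "4000 Sales", "debit": 0.0,    "credit": 3500.0},
    ]

    with open("trial_balance.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["account", "debit", "credit"])
        writer.writeheader()
        writer.writerows(trial_balance)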

115 Other Trends Adaptable Accounting refers to accounting software that can be easily adapted or customized to fit most business processes. Users often see the need for accounting software to adapt to: The different computing platforms used throughout the organization The specific terminology and data capture needs of local business units A variety of business rules in place by governmental bodies and regulatory entities Differences in the implementation of specific business processes The demand for add-on functionality not provided by the software vendor


117 Internet Accounting (Client/Browser Architecture)
Allows users to access accounting-related information from any Internet connection via a platform-independent desktop Web browser. For example, users can initiate a transaction, participate in a transaction workflow, run a query, or request a report without having any accounting software on their own PC. Internet Accounting (Client/Browser Architecture) is the integration of accounting applications with the Internet. It allows users to access accounting-related information from any Internet connection via a platform-independent desktop Web browser. From an Internet browser, users can, for example, initiate a transaction, participate in a transaction workflow, run a query, or request a report without having any accounting software on their own PC. We are increasingly moving toward a world in which transactions will reach accounting systems through electronic commerce, so manual transaction entry will soon become a dying art.
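A minimal sketch, using only Python's standard library, of the client/browser idea: the accounting data lives on the server and any Web browser can request a small report over HTTP, with no accounting software on the client PC. The URL path and report contents are invented for the example.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    import json

    # Toy "Internet accounting" endpoint: the browser asks for /report and the
    # server returns account balances as JSON.
    BALANCES = {"1000 Cash": 3500.0, "4000 Sales": -3500.0}

    class ReportHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/report":
                body = json.dumps(BALANCES).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), ReportHandler).serve_forever()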

118 Workflow Accounting Trends
The automatic routing of accounting-related data to the users responsible for working on it. System setup workflow Message-based workflow Form-based workflow Transaction-based workflow Web workflow Event-driven workflow Workflow Accounting is the automatic routing of accounting-related data to the users responsible for working on it. The data may be physically moved over the network or maintained in a single database with the appropriate users given access to it. Triggers can be implemented in the system to alert managers when operations are overdue. Client/server accounting software generally provides at least six types of workflow enablement: 1. System setup workflow 2. Message-based workflow 3. Form-based workflow 4. Transaction-based workflow 5. Web workflow 6. Event-driven workflow
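A minimal sketch of the trigger idea described above: each item is routed to the responsible user, and a manager is alerted when an item is past its due date. The task records, dates, and notify function are invented for the example.

    from datetime import date

    # Event-driven workflow sketch: route each task to its responsible user and
    # trigger an alert to the manager when a task is overdue.
    tasks = [
        {"doc": "Invoice INV-1001", "assigned_to": "a.payable", "due": date(2001, 3, 1), "done": False},
        {"doc": "Expense EXP-204",  "assigned_to": "controller", "due": date(2001, 4, 15), "done": True},
    ]

    def route_and_monitor(tasks, today, notify):
        for task in tasks:
            notify(task["assigned_to"], f"Please process {task['doc']}")
            if not task["done"] and task["due"] < today:
                notify("manager", f"OVERDUE: {task['doc']} assigned to {task['assigned_to']}")

    route_and_monitor(tasks, today=date(2001, 4, 1),
                      notify=lambda user, msg: print(f"to {user}: {msg}"))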

119 Component Accounting Refers to accounting software components that work together and cooperate with each other. Voice Video Images Component Accounting refers to accounting software components that work together and cooperate with each other. The industry is moving toward document-centric processing, in which a compound document contains text, voice, images, and video. Component accounting systems can be deployed in a variety of ways. The four main layers of software functionality are: 1. Presentation 2. Validation 3. Processes 4. Database
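A minimal sketch of splitting accounting functionality into the four layers named above. The function names and the balancing rule are invented for illustration; the point is that each layer only calls the one below it, so the layers can be deployed on different machines.

    # Four cooperating layers of a component accounting system.
    DATABASE = []                                      # database layer: storage

    def validate(entry):                               # validation layer
        if round(sum(line["amount"] for line in entry["lines"]), 2) != 0.0:
            raise ValueError("journal entry must balance to zero")

    def post_entry(entry):                             # process layer: business logic
        validate(entry)
        DATABASE.append(entry)

    def entry_form(description, lines):                # presentation layer: user-facing
        post_entry({"description": description, "lines": lines})
        print(f"Posted: {description} ({len(lines)} lines)")

    entry_form("Record cash sale",
               [{"account": "1000", "amount": 250.0},
                {"account": "4000", "amount": -250.0}])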

120 OLAP Accounting OLAP is fast becoming another technology and functional differentiator between accounting systems. The OLAP functionality may be built into the accounting suite or delivered via integration with third-party products. OLAP Accounting is about using a multi-dimensional database engine for database management, importing and converting data from heterogeneous file and database systems, and providing graphical query and reporting tools that support ad hoc data analysis. OLAP tools usually include some form of drill-down capability.
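A minimal sketch of the multi-dimensional roll-up and drill-down idea over a tiny fact table, using only the standard library. The dimensions (year, region, product) and figures are invented; a real OLAP engine would precompute and index these aggregations.

    from collections import defaultdict

    # Tiny OLAP-style cube: sales facts with three dimensions. Rolling up drops
    # a dimension; drilling down adds one back.
    facts = [
        {"year": 2000, "region": "East", "product": "Widgets", "sales": 120.0},
        {"year": 2000, "region": "West", "product": "Widgets", "sales": 80.0},
        {"year": 2001, "region": "East", "product": "Gadgets", "sales": 200.0},
    ]

    def roll_up(facts, dimensions):
        """Aggregate the sales measure over the chosen dimensions."""
        totals = defaultdict(float)
        for f in facts:
            key = tuple(f[d] for d in dimensions)
            totals[key] += f["sales"]
        return dict(totals)

    print(roll_up(facts, ["year"]))              # summary level
    print(roll_up(facts, ["year", "region"]))    # drill down by region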

121 Liv Watson – Senior Director of Information Technology
? Questions and Answers Liv Watson – Senior Director of Information Technology


Download ppt "Accounting Systems Technology for the 21st Century"

Similar presentations


Ads by Google