- Software Development Notes
- Study Design
- http://www.vcaa.vic.edu.au/Documents/vce/computing/ComputingSD-2016.pdf
- Data Types
- Data types are the particular forms that an item of data can take, including numeric, character and Boolean; a specific type is characterised by the kinds of operations that can be performed on it.
- String
- Alphanumeric data that can be of any length, from zero characters up to the limits of available memory and the language.
- Integer
- A whole number (not a fraction), which can be positive or negative, for example 157 or -123,435,657.
- Boolean
- Data type whose values can only be one of two logical options, such as True or False, On or Off, 1 or 0, or Yes or No.
- Currency
- Data type used exclusively for financial operations. Utilises a high-precision fractional part to prevent rounding errors in cents.
- Floating Point (Decimal or Real)
- Type that stores numbers with fractional parts, such as 1.2753
- Character (Char)
- Individual alphanumeric data type used to store one single character, usually in 1 byte of memory for ASCII text (2 bytes in some encodings).
- The Importance of Data Types:
- A data type dictates what kinds of operations can be performed on a variable and, by extension, the degree to which that variable can be manipulated. Data types should be chosen based on what a variable must do.
- Different data types have different storage sizes and thus affect how much memory a variable uses, affecting processing power and rate of calculations involving that variable.
- Data Structures
- Array
- Special variable which can hold more than one value at a time.
- A series of elements of the same size and type; for example, an array of integers or characters. An array can hold anything that has a defined data type.
- Array Key Points
- Each element has the same data type
- Array stored with no gaps between elements
- Arrays can have more than one dimension
- A one dimensional array is called a vector
- A two dimension array is called a matrix
- Array (Integer Index)
- Single data type storage of multiple values, where each value is referenced through an index number, usually starting with 0; index [0] being the first item stored in the array.
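- A minimal Python sketch of integer-indexed access (the scores are invented for illustration; Python lists are dynamic, but one is used here like a fixed array of integers):

```python
# Five test scores stored in an array-like list of a single data type.
scores = [72, 85, 90, 66, 78]

print(scores[0])    # index [0] is the first item stored -> 72
print(len(scores))  # number of elements -> 5
scores[2] = 95      # overwrite the third element via its index
```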
- Record (Field Index)
- A record is a collection of fields, possibly of different data types, typically in a fixed sequence. Each item in the record is accessed using a field index, which is more meaningful than the simple integer index of an array.
- Associative Array
- An abstract model for data structures composed of a collection of (key, value) pairs; the key is the identifier for a piece of data (the value). When an associative array is implemented with a hash function, values can be accessed directly by their keys.
- Hash Table
- A data structure implemented to allow easy navigation of associative arrays, database indexes, etc. It does this by converting keys into numbers through a hash function, allowing the data associated with a key to be retrieved almost instantly without the need to search through the data.
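- Python’s built-in dict is an associative array backed by a hash table, so it can sketch both ideas at once (the stock items are invented for illustration):

```python
# Keys are hashed internally, so lookups do not scan every entry.
stock = {"apples": 12, "bananas": 30}

print(stock["bananas"])   # value retrieved via its key -> 30
stock["cherries"] = 5     # insert a new (key, value) pair
print("apples" in stock)  # membership test also uses the hash -> True
```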
- Queue
- First Item In, First Item Out
- Stack
- First Item In, Last Item Out
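- Both behaviours can be sketched with Python’s standard library (the values are arbitrary):

```python
from collections import deque

# Queue: first item in, first item out (FIFO).
queue = deque()
queue.append("a")
queue.append("b")
first_out = queue.popleft()
print(first_out)  # -> a (the first item added leaves first)

# Stack: first item in, last item out (LIFO); a plain list suffices.
stack = []
stack.append("a")
stack.append("b")
top = stack.pop()
print(top)        # -> b (the last item added leaves first)
```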
- File Formats
- CSV Format
- Comma Separated Value format is one of the most common file formats, comprising a text file in which items of data are separated by commas. CSV files provide a convenient standard for transferring information between software solutions.
- XML Format
- eXtensible Markup Language is a method of encoding data in a file that is accessible and easy to read for both software solutions and people. It structures data around user-defined ‘markup’ metatags and a hierarchical layout through spacing, indents, etc.
- Key Differences between CSV and XML:
- Simpler CSV (comma separated value) data files only contain raw data. XML has field names, structure.
- XML data contains no information about how the data should be presented. The same XML data can be displayed in many different ways.
- In a CSV file the exact number and types of data need to be known in advance for a software solution to read it.
- JSON
- JavaScript Object Notation is a simple data transfer format used to maintain the structure of the data without any information regarding its presentation. It is less descriptive than XML and more descriptive than CSV. It is commonly used to transfer data between web applications.
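- The same record serialised as CSV and JSON, sketched with Python’s standard library (the field names are invented for illustration):

```python
import csv
import io
import json

record = {"name": "Ada", "score": 91}

# CSV: raw values only; a reader must know the field order in advance.
buf = io.StringIO()
csv.writer(buf).writerow([record["name"], record["score"]])
print(buf.getvalue().strip())     # -> Ada,91

# JSON: keeps field names and structure, but no presentation info.
text = json.dumps(record)
print(text)                       # -> {"name": "Ada", "score": 91}
print(json.loads(text)["score"])  # -> 91
```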
- Representing designs(Design Tools)
- Data Dictionary
- A data dictionary is a design tool which summarises what data a solution will require as well as the properties of that data. It is generally formatted as a table listing all required data for a program alongside descriptions and further metadata.
- Data dictionaries are used to:
- Design the structure of a program or database,
- Act as a reference source during development
- Assist in software maintenance and upgrade after implementation.
- Object Description
- A table that simply lists all of the necessary properties of an object, such as its size, position, caption, name, etc. Only the properties relevant to the object’s class would be listed, and properties with satisfactory default values do not need to be included in the table.
- Mock-up
- A design tool for creating the appearance of a solution, often drawn-up representations of user interfaces, screen displays and visual outputs.
- Mock-ups generally show:
- Positions of elements
- Relative sizes
- Margins, borders
- Alignment of objects
- Colours
- Image contents
- Headings, text blocks
- Navigation controls
- Pseudocode
- A written description of a programming algorithm unbound by the strict rules and conventions of code. The main purpose of pseudocode is outlining the ideas and meaning behind an algorithm and any structure in the algorithm should generally serve to further outline the code’s logic.
- VCAA Pseudocode Rules:
- 1. The = symbol is only used for logical testing
- 2. The ← symbol is used for assignment
- 3. Code indentation is always used to show lines of code are controlled by
- selection structures or iteration structures
- 4. Keywords can be arbitrary, but they must also be consistent.
- 5. Blocks of pseudocode should start with BEGIN and finish with END.
- Additionally, ‘Internal Documentation’ can be included in pseudocode to explain specific aspects, such as the meaning of keywords.
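- A short example following these rules (the algorithm is invented for illustration; the comments show internal documentation):

```
BEGIN
    Total ← 0
    Count ← 1
    WHILE Count <= 10          ' internal documentation: sum 1 to 10
        Total ← Total + Count
        Count ← Count + 1
    END WHILE
    IF Total = 55 THEN         ' = is used only for the logical test
        DISPLAY "Correct"
    END IF
END
```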
- Language Features
- Instruction
- A simple command directing an action the code should carry out.
- Procedure
- Also known as a subroutine, a procedure is a group of instructions that together perform a specific task within the program. Procedures are often repeatedly called by other parts of the program to carry out their task and generally do not interact with other sections of the program beyond the scope of that task. Procedures can be differentiated from functions by the fact that they do not return an output.
- The act of breaking down large blocks of code into different procedures is known as modular programming.
- Method
- An action that can be carried out on an object of a specific class, such as ‘button.press’.
- Function
- A procedure that is called upon to calculate and return an output. Functions can be further differentiated from procedures by the fact that functions possess a () after their name, e.g. SORT(); within these parentheses, parameters can be entered to control how the function performs its calculations.
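- A minimal Python sketch of the distinction (the names are hypothetical):

```python
def show_greeting(name):
    # A procedure: performs its task but does not return an output.
    print("Hello, " + name)

def average(values):
    # A function: takes parameters inside its parentheses and
    # returns a calculated result to the caller.
    return sum(values) / len(values)

show_greeting("Sam")         # side effect only, nothing returned
result = average([4, 8, 9])  # result -> 7.0
```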
- Control Structures
- A block of code that dictates which lines of code will be executed, often based on an analysis of variables. There are three types of control structures:
- Sequence: The default control structure, in which lines are executed in order (from top to bottom).
- Selection: Control structure that executes some code if a variable meets certain conditions; IF statements are an example of this.
- Iteration: A control structure that repeatedly executes (loops) a block of code for as long as a variable meets certain conditions. This can be seen in WHILE and FOR loops.
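- All three structures in one short Python sketch (the values are arbitrary):

```python
n = 3

# Sequence: statements run top to bottom.
doubled = n * 2

# Selection: the IF executes a branch only when its condition holds.
if doubled > 5:
    label = "big"
else:
    label = "small"

# Iteration: the WHILE repeats its block while the condition stays true.
countdown = []
while n > 0:
    countdown.append(n)
    n = n - 1

print(label)      # -> big
print(countdown)  # -> [3, 2, 1]
```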
- Sorting Techniques
- Suitability
- The type of sort used should be the fastest for the data set whenever possible. More complex sort algorithms, such as Quick Sort, are much faster on larger data sets, but may be no faster or even slower on smaller data sets than a simpler algorithm, such as a Bubble Sort.
- Complexity
- The complexity of the algorithm used to perform the sort usually has an inverse relationship with sort time: the more complex the algorithm, the faster the sort performs on large data sets. However, complex algorithms can be slower than, or no faster than, simple algorithms when the number of items to be sorted is small.
- Sort Time
- Sort time depends on the number of items needing to be sorted and the algorithm used to sort them. In general, the more complex the algorithm, the shorter the sort time. Simple algorithms can be used when the number of items is small, but for larger lists simple sorts, such as a Selection Sort, will result in longer sort times compared with more complex algorithms, such as Quick Sort.
- Selection Sort
- A selection sort repeatedly finds the lowest item in the unsorted portion of an array and swaps it with the item at the start of that portion. At the end of the sort the items are ordered from lowest to greatest. Selection sorts are very simple, but can take a long time to sort large lists.
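- The process above, sketched in Python:

```python
def selection_sort(items):
    """In-place selection sort, ordering items lowest to greatest."""
    for i in range(len(items)):
        # Find the index of the smallest remaining item...
        lowest = i
        for j in range(i + 1, len(items)):
            if items[j] < items[lowest]:
                lowest = j
        # ...and swap it to the start of the unsorted portion.
        items[i], items[lowest] = items[lowest], items[i]
    return items

print(selection_sort([5, 2, 9, 1]))  # -> [1, 2, 5, 9]
```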
- Quick Sort
- Quick sorts are both efficient and fast, but far more complex than selection sorts. Quick sorts use recursion, or can utilise a stack, to perform the sort. The sort selects a pivot value from the list and then partitions all other items around the pivot: lower items to the pivot’s left and higher items to its right. These sublists can then be recursively sorted with new pivot values.
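- A recursive sketch in Python (naively using the first element as the pivot; production implementations choose pivots more carefully and partition in place):

```python
def quick_sort(items):
    """Recursive quick sort with the first element as the pivot."""
    if len(items) <= 1:
        return items          # a list of 0 or 1 items is already sorted
    pivot = items[0]
    lower = [x for x in items[1:] if x < pivot]    # items below the pivot
    higher = [x for x in items[1:] if x >= pivot]  # items equal or above
    # Recursively sort each sublist and join around the pivot.
    return quick_sort(lower) + [pivot] + quick_sort(higher)

print(quick_sort([7, 3, 9, 1, 5]))  # -> [1, 3, 5, 7, 9]
```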
- Search Techniques
- Linear Search
- The only way to search unsorted data, and the simplest search algorithm to write, a linear search checks each item in a data set (comparing it with the search item) until it finds the desired item. As the maximum number of comparisons is equal to the number of items in the set, a linear search works best for small sets but is highly inefficient for larger sets of data.
- Binary Search
- A binary search is an efficient and swift way of searching through a dataset, but requires such lists to be sorted first. The search operates by dividing a list in two and deciding which partition will contain the search object. This process is then repeated with the half containing the object until the search item is located.
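- Both searches sketched in Python, returning the index of the item or -1 if absent (a common but not universal convention):

```python
def linear_search(items, target):
    """Check each item in turn; works on unsorted data."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

def binary_search(items, target):
    """Repeatedly halve a SORTED list until the target is found."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1   # target must be in the upper half
        else:
            high = mid - 1  # target must be in the lower half
    return -1

print(linear_search([9, 4, 7], 7))         # -> 2
print(binary_search([1, 4, 7, 9, 12], 9))  # -> 3
```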
- Data Validation
- Existence Check
- A check if the user has inputted data.
- Range Check
- A check if the inputted data is within the range of acceptable values.
- Type Check
- A check if the input is of the correct data type.
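- All three checks sketched in Python on a hypothetical age field (the limits and error messages are invented for illustration):

```python
def validate_age(raw):
    """Existence, type and range checks on a raw input string."""
    if raw == "":                # existence check: was anything entered?
        return "Error: no input"
    if not raw.isdigit():        # type check: must be a whole number
        return "Error: not a number"
    age = int(raw)
    if not 0 <= age <= 120:      # range check: acceptable values only
        return "Error: out of range"
    return age

print(validate_age(""))     # -> Error: no input
print(validate_age("abc"))  # -> Error: not a number
print(validate_age("17"))   # -> 17
```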
- Testing
- A process of testing a solution to determine if it functions correctly, meeting all user needs and fulfilling both functional and nonfunctional requirements. Testing is generally done during or after development and is not the same as evaluation.
- Trace Table
- A tool for testing the behaviour of algorithms in pseudocode through step-by-step, line-by-line manual calculations to ensure expected output is obtained. This process can be represented in table format to display the logical progression of each line and to ensure all pathways within the algorithm are shown and that all variables and constants are represented.
- Test Data
- Data that is designed to test the parameters of what a solution considers ‘acceptable input’ to make sure its validation functions correctly and that its algorithms can correctly process all desired inputs.
- Documenting Test Results
- Documenting test results is important in order to obtain proof that tests have been carried out and to create references for future tests and development. A good way to record testing data is in a testing table, which records the specific tests carried out, how successful the tests were, and what measures were taken to correct any errors that were found.
- Internal Documentation
- Non-executable lines in a program’s code left by developers that serve as notes on methods and syntax. They are intended to be read by programmers, not users. Internal documentation increases the maintainability of code, as it can explain to new programmers how particular sections of code operate and jog the memory of programmers returning to the solution.
- Purpose of Internal Documentation
- Provides a simple description, describing what is occurring and includes information not necessarily easily understood by looking at the code itself. It can assist the creator of a solution at a later date to remind them of what the code does and assists future programmers who wish to modify the software solution.
- Characteristics of Internal Documentation
- Describe the functions of key variables and procedures
- Explanation of the naming and coding conventions
- References to code that have been sourced
- Revision information
- Contributions
- Has no effect on efficiency; the compiler ignores comment lines
- Naming Conventions
- A naming convention is the designation of elements of a program with names that are easily understood, short and convey meaning about that element’s purpose. Such conventions should be applied to the naming of all variables and subroutines within a solution. They can aid the programmer in remembering the names of functions, variables and other objects, ensure all element names are compatible with the programming language used, assist new programmers in understanding the solution and help with future maintenance of the solution. Examples of naming conventions are Hungarian Notation (a short prefix at the start of an element’s name indicates its type), camelCase (capitalise each word and remove the spaces between them) and Python’s PEP 8 style guide.
- ANALYSIS
- Collecting Data / Determining Needs
- Evaluating an existing or new information system helps to determine the requirements for a new solution.
- The efficiency and effectiveness of a data collection method should be considered before applying it.
- Survey
- Questionnaire distributed among stakeholders. Containing mostly tick the box style questions along with sections for more detailed comments.
- Focuses mainly on quantitative data.
- Cheap and time-efficient way to gather data from many stakeholders.
- Unobtrusive and easy to fill out.
- Unreliable; subjects may not answer every question.
- Rigid structure makes it impossible to gather data beyond the set survey questions.
- Interview/Focus Group
- One on one interviews conducted on specific stakeholders within an information system.
- Personal nature of interviews allows for a flexible line of questioning that cannot be provided by methods such as surveys.
- Interviews can generate large amounts of qualitative data.
- Time consuming to organise and carry out.
- Expensive to set up.
- Intrusive; interview subjects may not always be cooperative or tell complete truth.
- Observation
- Viewing how an information system and its various parts (including people) function naturally and gathering notes on what is observed.
- Detailed, reliable and extensive qualitative data.
- Unlike interviews or surveys, does not rely on secondhand accounts; reflects what really happens rather than what people say happens.
- Can reveal facts that interview or survey questions may not anticipate.
- Time consuming.
- Very intrusive and may irritate stakeholders.
- Features of Functional and Nonfunctional Requirements
- Functional Requirements
- A functional requirement is a task that a software solution must be able to do
- Non-Functional Requirements
- A non-functional requirement is a quality or characteristic a software solution must possess, such as ease-of-use, maintainability, correct language, etc.
- Solution Constraints
- Constraints are limits on developer freedom in designing a solution and conditions that affect the operation of a solution.
- Economic
- Economic constraints include the amount of resources available for the development of the solution, the amount of resources available for the maintenance of a solution and the price the final solution may be sold to customers for.
- Legal
- Legal constraints are the laws which govern the development and operation of a solution. In Australia, laws such as the Copyright Act 1968 prevent you from using others’ code without permission, while the Privacy Act 1988 prevents you from creating a solution that secretly harvests personal information from customers.
- Social
- Social constraints are restrictions based around the beliefs, attitudes and values held by wider society. They include restrictions on violence and sexuality depicted by media, taboos around religious imagery and controls on other offensive material that solutions could involve. Additionally, solutions may be expected to protect users from harassment or scamming.
- Technical
- Technical constraints generally concern the hardware and software a solution must operate with. Some solutions may be expected to run on certain devices(such as company-issued laptops) or be compatible with certain pieces of software. Solutions based on mobile devices should consider screen size, resolution and available processing power. Solutions may also be expected to follow technical conventions, such as depicting onscreen keyboards in the QWERTY format.
- Useability
- Solutions must accommodate the needs of the user in order to ensure their continued use. This can be done through minimising the amount of learning needed to understand a solution, incorporating systems to account for user errors(e.g. an undo button) and through constructing the solution with the user’s needs and characteristics in mind, e.g. using simple language in a UI designed for children.
- Solution Scope
- The scope defines what the solution can and cannot do, as well as the system’s boundaries and the extent of user interaction. Fundamentally, it identifies the responsibilities and parameters within which the solution must operate; however, in the course of a project these boundaries are often fluid and liable to change as the project progresses.
- Factors
- The scope of a solution is always directly influenced by the solution’s constraints, and in almost all cases the scope will fall within the boundaries of those constraints. The scope may additionally be affected by:
- The client’s needs
- Needs and attitudes of the target user base
- Results of data collection
- Overall project goals
- Development team needs and desires
- Scope Creep/Feature Creep
- Though the scope should not be absolutely rigid, too loose a scope can lead to ‘feature creep’, in which the scope is continually expanded and new features are introduced by an enthusiastic developer, only for the project to become bloated and unmanageable because of all the extra work that must now be done to meet the new scope.
- Software Requirements Specification (SRS)
- Features
- The SRS document provides a complete outline of a solution’s functional and non-functional requirements, constraints and scope, bringing together all data from analysis. It additionally can contain design tools such as mock-ups, use case diagrams and data flow diagrams.
- Purpose
- To comprehensively compile all of the analysis phase into one document and provide a complete breakdown of a solution into component parts, providing input to the designing stage and serving as a useful reference for all other parts of the Problem-Solving Methodology.
- DESIGN
- Generating Design Ideas
- Design ideas are rough, basic outlines of strategies to solve problems; they are almost never fully formed, workable ideas but rather general methods of approaching issues. Several design ideas should always be proposed and then judged using criteria such as development speed and cost, ease of development and functionality. Ideas that pass these criteria can be integrated into the final product.
- Techniques
- Brainstorming-Spontaneously producing many rough concepts and ideas without fear of rejection or ridicule.
- Mind maps-Free form documents in which ideas are proposed and then linked to other ideas.
- PMI(Plus/Minus/Interesting) diagrams that break down big ideas into smaller parts for consideration.
- Evaluating Alternative Design Ideas
- In the evaluation of alternative design ideas, it is important to consider the constraints and scope of a solution, as well as client and user needs.
- Criteria
- When evaluating a design idea against other design ideas, consider the following:
- Ease of implementation
- Advantages/disadvantages over other ideas
- Its relation to the functional and nonfunctional requirements of a solution
- Its resource and time cost compared to other proposals.
- Efficiency
- The measure of how much time, cost and effort is applied to achieve the intended result
- Measures of efficiency in a solution:
- Speed of processing
- Functionality
- Cost of file manipulation
- Effectiveness
- The measure of how well a solution achieves its intended results
- Measures of effectiveness in a solution:
- Completeness: Whether all required output is produced from the given input data
- Readability: How easily the outputted data can be read
- Attractiveness: Appeal of the user interface
- Clarity: How clearly the outputted data conveys its meaning
- Accuracy: How precise the outputted data is
- Accessibility: How usable the software is
- Timeliness: Whether the data is produced quickly enough to remain useful
- Communication of message: Clarity/ understanding of the produced data
- Relevance: How necessary the data is
- Usability: Ease of use of the program
- Measure of effectiveness of an information management strategy
- Integrity of data
- Security
- Ease of retrieval and currency of files
- Measure of effective networks
- Reliability
- maintainability
- Design Tools: Depicting interfaces between Solutions, Users & Networks
- UML
- Unified Modelling Language (UML) is a standardised modelling language that encompasses a large variety of diagrams useful for representing aspects of a software solution. Each UML diagram type is strictly defined, allowing diagrams to be easily understood and making comparisons and references simple to create.
- Use Case Diagram (UCD)
- A diagram that depicts the functional aspects of a system, including the system’s goals and how people and organisations interact with the system to achieve their goals. UCDs are useful in many parts of the Problem Solving Methodology, being useful for analysis as well as training.
- SYSTEM BOUNDARY
- Symbol: Rectangle drawn around all use cases
- A representation of the confines of the system, denoting what is within (use cases) and without (actors)
- ACTOR
- Symbol: Human Figure
- Denotes an external entity, often a person or organisation, that interacts with the system. Actors connect exclusively to use cases.
- USE CASE
- Symbol: Ellipse with function written inside
- Represents a function within a solution
- ASSOCIATION
- Symbol: Line between actors/use cases
- Represents an interaction that can occur between a use case and an actor(s). Use cases and actors can have many different associations, but an actor should never have an association linking it to another actor.
- INCLUSION
- Symbol: Dotted arrow with ‘<<Includes>>’ written in
- Reflects links between use cases, indicating that the functionality of a use case can be utilised by another use case.
- EXTENDS
- Symbol: Dotted arrow with ‘<<extends>>’ written in with a condition ‘{}’
- Denotes the functionality of a use case contributing to or enhancing another use case under specific circumstances. Often conditional, such as an actor being an administrator with special privileges (in which case the condition becomes ‘{is administrator}’).
- Data Flow Diagram (DFD)
- DFDs serve as a diagrammatic representation of how data moves through a system and how it is modified by that system’s processes step-by-step, as well as how data is stored and how it is used.
- ENTITY
- Symbol: Small rectangle with label
- A representation of a person, organisation or agent outside the system that either provides data to or receives data from the system. Data cannot flow between entities, and entities can only represent what is outside the system.
- DATA FLOW
- Symbol: Arrow with label
- Denotes data moving between entities, processes and data stores, labelled to show what the moving data actually is. Data flows must always lead somewhere; there can be no ‘arrows to nowhere’.
- DATA STORE
- Symbol: Label within two vertical black lines
- Represents a storage location within a system and is labelled with the name of the file or location. A data store only holds data; it does not process or modify it. Additionally, data cannot flow between data stores directly, nor can data flow directly between entities and data stores.
- PROCESS
- Symbol: Numbered circle with a label
- Denotes an activity that receives data and modifies or transforms it in some way. Processes are numbered in the order that they occur and contain a description of the task they perform. A process must always produce some data when it receives a data flow and in most cases data should be transformed in some way by a process or additional data flows should be created.
- Context Diagram
- A unique type of DFD that focuses exclusively on an organisation’s interactions with external actors that supply or receive data from the organisation; it contains no information on the internal processes of an organisation. They can also be known as ‘level 0 DFDs’
- A context diagram contains ALL THE FEATURES OF A STANDARD DFD, except all processes are replaced by:
- ORGANISATION
- Symbol: Labelled large circle
- A representation of the organisation itself within the context diagram. It represents all the inner processes of the organisation’s systems and behaves like a regular DFD process, though it often sends and receives many more data flows.
- Solution Design Considerations
- Usability
- A measure of user-friendliness, whether or not a piece of software is clear in its use or intuitive in design etc.
- Reliability
- A measure of how long a piece of software can operate in its environment.
- Portability
- The ability for a piece of software to be cross-compatible; how easy it is to migrate it to a different environment, i.e. a different operating system, coding language, etc.
- Robustness
- A measure of a software’s ability to withstand bad input or bad data.
- Maintainability
- A measure of how easy a piece of software is to be edited or reviewed, this is based on organisation of code and notes left by a developer.
- Affordability
- A measure of resources a program may take to develop in the form of money and subsequently time.
- Security
- A measure of a solution’s resilience to various deliberate threats, including DDoS attacks, physical theft of data, or remote theft of data through hacking.
- Interoperability
- A solution’s ability to work with other systems or solutions with little effort from the user.
- Marketability
- The measure of a solution’s qualities that would allow it to be sold to a wide audience.
- DEVELOPMENT
- Project Management
- Gantt Chart
- A project management tool that displays the tasks that need to be completed in a project and visualises the project timeline, the time needed for each task, the dependencies between certain tasks, the project’s critical path and milestones in the development timeline.
- Milestone
- A point in a project’s development (typically visualised as a task of zero duration on project management tools) that marks a significant stage in the project timeline.
- Dependency
- A dependency refers to a task in a project that cannot begin unless another task earlier in the project’s timeline is completed beforehand.
- Resource
- Project resources include personnel, hardware, software as well as a variety of other items such as hiring venues for consultations. All project resources are only available in finite amounts and use of resources must be planned carefully.
- Critical Path
- In project management, the critical path is the sequence of dependent tasks with the longest total duration, thus representing the minimum time in which the project can be completed without delay.
- Gantt Chart Creation Process
- 1. Task Identification
- The process of identifying and then listing every task in a project, ensuring they are all within the scope.
- 2. Sequencing
- Sequencing refers to the process of ordering the project’s tasks, including the identification of dependencies and milestones.
- 3. Time Allocation
- Deciding how much time each task will take, thereby establishing the critical path and lag/lead times.
- 4. Resource Assignation
- Deciding upon the allocation of resources to each task.
- Recording Project Progress
- Projects almost never progress smoothly as planned without variations in resources or schedules and thus it is important to monitor and document changes to the plan in order to accommodate these changes.
- Annotation
- Project managers should annotate their planning documents as tasks change or additions are catered for.
- Task Adjustment
- Task adjustment is the modification of the dependencies, time, resources, etc. that are assigned to a task in response to the current conditions of the project.
- Timeframe Adjustment
- Timeframe adjustment refers to the modification of overall timeframes within the project and additional factors such as resource bookings and testing times in response to the adjustment of earlier tasks.
- Log
- A log is a record, written by the project manager, of all changes made to a project and the impact of these changes. It is useful for filling in project workers on changes made to the project and for reporting changes in the plan to superiors.
- Project Plan Effectiveness
- Evaluation Strategies for project plans & solutions
- Efficiency
- Efficiency is the measure of the time, cost and effort required in the planning and execution of a project.
- Effectiveness
- Effectiveness is the measure of how well a project plan fulfills its role in guiding the progress of a project.
- Security
- Data Protection
- Legislation such as the Privacy Act 1988 mandates that software solutions protect the data they store, through both physical and digital means. Physical security includes barrier techniques (locks, doors, alarms) and biometrics (fingerprint scanners, photographic IDs), while digital techniques include encryption through protocols such as TLS (Transport Layer Security, the successor to Secure Sockets Layer, which encrypts data sent over the web) and HTTPS (a protocol for secure information transfer over the internet).
- Authentication
- Solutions must ensure that those accessing stored data have the necessary levels of authority by authenticating those who attempt access. This can be achieved through systems like password protection, SSO (single sign-on) and access restrictions (levels of access).
- Application Architecture
- The process of identifying the components of a software solution, and their interrelationships, so that it meets all technical requirements while optimising common quality attributes such as performance and security.
- Mobile
- Mobile device design can be challenging due to the varying specifications of the devices you are developing for. Factors such as screen size, available memory and storage space vary between phones. To accommodate different devices, mobile application architecture can be constructed as either ‘thin client’ or ‘rich client’.
- Rich Client
- A ‘rich client’ solution performs the majority of its processing on the mobile device itself. While such a design can require the client to own fairly powerful mobile devices, it does not require the developer to invest in back-end servers and security infrastructure to support the solution.
- Thin Client
- A ‘thin client’ mobile solution delegates most of its processing to an external, developer-owned server rather than the actual device the solution is operating on. While such design is accommodating of all kinds of devices, it requires the developers to maintain comprehensive server and security infrastructure.
- Peer-to-peer
- ‘P2P’ is a distributed application architecture in which tasks or work are distributed between peers of equal standing. Peers generally manage and share their own connections without the need for a central server.
- Internet
- Internet-based applications are almost completely device independent, being based in browsers instead. However, such applications place server maintenance and security at the forefront of priorities.
- Goals/Objectives
- Organisational
- Organisation Goals: The long-term plan of the organisation and the reason it exists
- Organisation Objectives: The short-term, measurable objectives of the organisation
- Mission Statement: A broad statement about an organisation's goals
- Information System
- System goals: What each system in an organisation is aiming to achieve. System goals generally align with organisational goals in some manner but are much more specific to that particular system.(If an org goal is ‘improve efficiency’, a system’s goal may be concerned with making that system more efficient.)
- System Objectives: Quantifiable, specific targets that a system can reach in order to take steps towards overall system goals. Are often numerically based(like ‘perform task A 5% faster’).
- Laws
- Privacy Act 1988
- A law that sets out the ways in which an organisation can collect, use or distribute personal data.
- Collection: Organisations should only collect personal information that is necessary for one or more of its functions
- Use and Disclosure: Organisation must not use or disclose information about an individual for any secondary purpose other than its original purpose for its collection
- Data Quality: Organisation must ensure personal information it collects, uses or discloses is accurate, complete, and up to date
- Data Security: Organisation must ensure personal information collected is protected from misuse
- Openness: Organisation must clearly express its policy on its management of its personal information
- Access and Correction: Organisation must provide individuals with access to their own personal information and allow them to correct it
- Identifiers: Organisation must not adopt another organisation's identifier (such as a government agency's) as its own identifier for an individual
- Anonymity: Organisations must allow individuals the option of not identifying themselves
- Transborder data flow: Organisations must not transfer personal information about an individual to a foreign country without consent
- Sensitive Information: Organisations must not collect sensitive information about an individual unless consent is given by the individual, or law requires collection
- Privacy and Data Protection Act 2014 (a Victorian Act covering state public sector data; note that the Australian Privacy Principles (APPs) listed below are set out in the federal Privacy Act 1988)
- APP 1: Open and transparent management of personal information
- APP 2: Anonymity and pseudonymity
- APP 3: Collection of solicited personal information
- APP 4: Dealing with unsolicited personal information
- APP 5: Notification of the collection of personal information
- APP 6: Use and disclosure of personal information
- APP 7: Direct marketing
- APP 8: Cross-border disclosures
- APP 9: Adoption, use or disclosure of government related identifiers
- Copyright Act 1968
- In Australia, if you are the creator of a work, you own that work automatically.
- Owners have the right to:
- Choose when and how their work will be distributed
- Incorporate technological protections to guard their work
- Take legal action against those who use their work without permission
- The main exceptions to this law ('fair dealing') cover criticism or review, research or study, parody or satire, and news reporting. Additionally, up to 10% of a reference book can be legally copied without permission for research or study.
- OSS (open source software) and CC (Creative Commons) licences allow work to be distributed under certain conditions (for example, a work under some CC licences may not be used commercially)
- Spam Act 2003
- The Spam Act regulates the sending of unsolicited electronic messages such as emails, SMS and instant messages. It does not cover voice communications or physical mail. It only takes a single unsolicited message to be considered spam.
- The act mandates:
- Emails and messages should only be sent to those that have consented to receive them.
- Emails and messages should clearly identify their sender
- Emails and messages must allow recipients to remove themselves from mailing lists and request that no further messages be sent.
- Additionally, organisations are not allowed to engage in 'address harvesting', the acquisition of customer addresses without the customer's knowledge.
- Charter of Human Rights and Responsibilities Act 2006
- A Victorian Act protecting the basic freedoms and rights of all Victorians, including freedom from forced labour and freedom of religion. Sections 13, 14 and 15 are relevant to Software Development.
- 13: A person has a right to his or her privacy and the right not to have their lives unlawfully or arbitrarily interfered with.
- 14: A person has the right to freedom of thought, conscience, religion and belief, and is free to demonstrate these beliefs.
- 15: A person has a right to freedom of expression provided this expression respects the rights and reputation of others and is in accordance with national security, order and public morality.
- File Access
- File size
- The size of the data that needs to be stored is an important consideration when choosing storage medium and file type.
- 1000 bytes = 1 KB
- 1000 KB = 1 MB
- 1000 MB = 1 GB
- 1000 GB = 1 TB
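The unit ladder above can be applied programmatically. A small sketch (Python assumed; the function name is hypothetical) that converts a raw byte count into the largest sensible decimal unit:

```python
def human_size(num_bytes):
    """Convert a byte count to the largest sensible unit (1000-based, per the table)."""
    for unit in ("bytes", "KB", "MB", "GB"):
        if num_bytes < 1000:
            return f"{num_bytes:g} {unit}"
        num_bytes /= 1000
    return f"{num_bytes:g} TB"

print(human_size(2_500_000))  # 2.5 MB
```

Note this uses the decimal (1000-based) units shown in the table above.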
- Storage medium
- A storage medium is any technology used to place, store and eventually retrieve data. Examples of storage media include CD-ROMs, USB drives, portable ZIP drives and file servers.
- Device memory utilises primary storage(RAM) and secondary storage(hard drive). Primary storage is a high-speed medium that can quickly be accessed to store temporary data for an application. Meanwhile, secondary storage is slower to access and should house permanent data that is only accessed occasionally such as files.
- File organisation
- Serial File
- A file containing sets of data of the same type stored in the order that it was entered. Searching a serial file requires a sequential comparison of all stored items. They are simple and fast to create but slow and awkward to use later on.
- Random Access File
- A random access file stores records of the same defined length with a strict, predictable structure; as a result of this you can instantly find and retrieve desired records from a random access file. This makes them fast and convenient to use but the constraints on record length may lead to either wasted storage space or cut off data.
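The "strict, predictable structure" described above is what enables instant retrieval: because every record has the same length, record N always starts at byte N × record_length. A minimal sketch (Python assumed; names are hypothetical):

```python
RECORD_LEN = 20  # every record is padded or truncated to exactly 20 bytes

def write_records(path, names):
    # Store each name as a fixed-length record (padding with spaces).
    with open(path, "wb") as f:
        for name in names:
            f.write(name.encode().ljust(RECORD_LEN)[:RECORD_LEN])

def read_record(path, index):
    # Seek directly to the record's offset instead of scanning the file.
    with open(path, "rb") as f:
        f.seek(index * RECORD_LEN)
        return f.read(RECORD_LEN).decode().rstrip()
```

The padding also illustrates the trade-off noted above: short names waste space, while names longer than 20 bytes are cut off.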
- File Management
- Security
- Archiving
- Removing old data which is used infrequently by transferring it to an external storage medium such as a flash drive, tape, disc or external HDD, thereby freeing up space; the old data must not remain in its original location.
- Backup
- Copying all files to a secure location so they can be recovered after a system failure. Unlike archiving, backing up does not move the original files from their location.
- Backup Types
- Full Backup
- A backup of all files in a backup set. It requires a large amount of storage space and takes the longest to prepare; however, it is also the fastest to restore, as only the most recent full backup is required. It can be inefficient, as all files are backed up even if they have not changed since the last full backup, and holding several full backups takes up a large amount of storage space.
- Differential Backup
- A backup of all files that have been modified or added since the last full backup. Differential backups are fairly efficient and relatively fast to perform, although restoring the backup requires both the last full backup and the last differential backup. The amount of storage space required for differential backups varies and duplicates can take up large amounts of space.
- Incremental Backup
- Incremental backups copy all files added or modified since the most recent backup of any type (full, differential or incremental). They are the fastest backup to perform and generally require the least space; however, they are the slowest to restore, as the last full backup and every incremental backup since it are required to fully restore all data.
- Disposal
- Characteristics of Data
- Accuracy
- How correct and error-free the data is, as obtained from its input source
- Timeliness
- How up to date the data is and whether it is available when it is needed
- Reasonableness
- How sensible the data is given the use the software makes of it
- Authenticity
- How true and legitimate the data is
- Correctness
- Data Management Practices
- Data Mining
- Data mining is the process of extracting as much useful information from a dataset as possible in order to draw deeper conclusions from the data. However, data must be obtained from a variety of sources, not all of which are reliable; the larger and more varied the data set, the greater the risk of unreliable conclusions.
- Effect on Stakeholders of Information Systems
- Advantages
- Disadvantages
- Data Integrity Loss
- How
- Data integrity can be compromised by a number of different threats: malware(such as trojans), accidents(such as forgetting to backup) and event-based threats outside of operator control(such as a power failure). Additionally, poor system data management can lead to inefficient systems that regularly lose data and can be a result of poor organisation, a lack of standards or conflicts between systems.
- Impact
- The loss or other compromise of a system’s data can impair the standard functions of the system and ultimately cost the organisation operating the system both time and money.
- Networks
- Wired
- A network in which all devices are connected by networking cable, such as Cat 6 Ethernet cable terminated with RJ-45 connectors. Usually used in an office to connect towers and servers
- Wireless
- A network in which devices connect to each other through Wi-Fi or Bluetooth. A server is usually connected to a Wi-Fi router, which then connects to the devices. Commonly used for mobile devices. Many companies make use of both wired and wireless networks.
- Intranet
- A form of internal network, commonly protected from the outside internet by a firewall. Used in schools and offices.
- Internet
- Virtual Private Networks (VPN)
- A private network setup across public network infrastructure which uses encryption to secure data that is sent across the network.
- Threats
- Accidental
- Accidental threats are threats carried out without malicious intent. They can be caused by human error or by an employee being tricked or manipulated.
- Deliberate
- Deliberate threats are threats carried out with malicious intent. They can come from hackers, individual crackers or criminal organisations.
- Events-based
- Non-malicious threats caused by events outside the control of users, such as natural disasters, hardware failures and software failures.
- Physical & Software Control of Data
- Security of data
- Communication of data
- Sharing data between Information Systems
- Role of:
- Hardware
- Software
- Technical Protocols
- TCP/IP
- Transmission Control Protocol/Internet Protocol is the basic communication protocol suite of the internet(though it can also be used by private networks).
- The transmission control protocol manages the assembly of data into packets that can be sent as well as the disassembly of received packets back into data.
- The Internet Protocol ensures that sent data goes to the right destination by directing towards a location based on its specific IP address.
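The packet assembly and routing described above is handled by the operating system's TCP/IP stack; application code simply opens a TCP connection and reads and writes a byte stream. A minimal loopback sketch (Python assumed):

```python
import socket
import threading

def echo_once(server_sock):
    # Accept one connection and echo back whatever the client sends.
    conn, _addr = server_sock.accept()
    with conn:
        data = conn.recv(1024)  # TCP has already reassembled packets into a stream
        conn.sendall(data)

# Bind to the loopback address; port 0 asks the OS for any free port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=echo_once, args=(server,), daemon=True).start()

# IP routes the connection to the destination address; TCP handles packets.
client = socket.create_connection(server.getsockname())
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
server.close()
```

Neither endpoint ever sees individual packets: splitting data into packets and reassembling them is exactly the job TCP performs underneath.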
- Tracing User Activity
- By tracking user activity and monitoring behaviour, organizations are able to optimize applications for their user base and thereby improve user satisfaction. Additionally, monitoring user activity can protect solutions from the actions of potentially hostile users.
- Tools
- UAM: user activity monitoring software, which can use visual forensics, user activity alerting and user behaviour analytics to log and monitor user activities. UAM is primarily utilised for protection.
- Keyloggers can monitor user keyboard inputs.
- Techniques
- Building surveys and prompts into the solution can be an easy way for solutions to be updated on user thoughts and actions without violating user privacy.
- Problem-solving Methodology
- Analysis - Analysing the problem.
- Solution Requirements
- What output is the solution to provide? What data is needed to produce that output? What functional requirements does the solution need, i.e. what is the solution required to do, and what non-functional requirements should it have?
- Solution Constraints
- What conditions need to be considered when designing the solution? Eg. Economic, technical, social, legal, usability.
- Scope of Solution
- The scope states the boundaries or parameters of the solution. It identifies the area of interest or what aspects of the problem will and will not be addressed by the solution.
- Design - How the program will operate and achieve the required solution
- Solution Design
- Planning how the solution will function and how it will appear. The solution design typically involves identifying what specific data is required and how the data will be named, structured, validated and manipulated.
- Evaluation Criteria
- What measures will be used to judge whether or not the solution meets the requirements? These criteria should arise from the solution requirements identified in the analysis stage.
- Development
- Manipulation (Coding)
- Electronically 'building' or creating the solution according to the design. Development may warrant modifications to the original design in order to create a working system.
- Validation
- Checking the reasonableness of the data being input. Validation can be both manual and electronic: proofreading is a manual technique in which a human scans the data for errors, while electronic validation is built into the system.
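Electronic validation typically layers several checks: an existence check, a type check and a range check. A minimal sketch (Python assumed; the function name and limits are hypothetical):

```python
def validate_age(raw):
    """Return (is_valid, error_message) for an age entered by the user."""
    if raw is None or str(raw).strip() == "":
        return False, "Age is required"               # existence check
    try:
        age = int(raw)                                # type check
    except ValueError:
        return False, "Age must be a whole number"
    if not 0 <= age <= 120:                           # range (reasonableness) check
        return False, "Age must be between 0 and 120"
    return True, ""
```

Note that validation only confirms the input is *reasonable*; it cannot confirm the input is *accurate* (a user can still type the wrong, but plausible, age).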
- Testing
- Testing whether the solution does what it was intended to do:
- Establishing what tests to use
- Determining what data will be used
- Determining expected results
- Conducting the test
- Recording actual results and comparing to the expected results
- Correcting any identified errors.
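The steps above (choosing test data, determining expected results, then comparing against actual results) can be sketched as a small test table (Python assumed; the function under test and its rule are hypothetical):

```python
def discount(total):
    """Hypothetical function under test: 10% off orders of $100 or more."""
    return total * 0.9 if total >= 100 else total

# Test table: (description, test data, expected result).
# Boundary values are chosen deliberately, as errors cluster at boundaries.
test_cases = [
    ("below boundary", 99, 99),
    ("on boundary", 100, 90.0),
    ("above boundary", 150, 135.0),
]

for description, value, expected in test_cases:
    actual = discount(value)                 # conduct the test
    status = "PASS" if actual == expected else "FAIL"
    print(f"{description}: expected {expected}, got {actual} -> {status}")
```

Recording both the expected and actual result, as the loop does, is what makes a failed test traceable back to an error worth correcting.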
- Documentation
- Writing internal and user documentation, including documentation within the UI (user interface), to support the functioning and use of the solution
- Evaluation
- Strategy
- The creation and implementation of strategies designed to determine the extent to which the software solution met its requirements. This includes identifying the data that needs to be collected (based on the criteria from the design stage), creating a timeline specifying the data collection period, and using various methods and techniques for collecting data (interviews, surveys, etc.).
- Report
- The creation of a report that details the extent to which the final solution met the criteria of the user. It should be based on criteria outlined in the design stage of the PSM.