Profession data analyst
Data analysts import, inspect, clean, transform, validate, model and interpret collections of data in line with the business goals of the company. They ensure that data sources and repositories provide consistent and reliable data. Data analysts use different algorithms and IT tools as demanded by the situation and the data at hand. They may prepare reports in the form of visualisations such as graphs, charts and dashboards.
- Query languages
The field of standardised computer languages for retrieval of information from a database and of documents containing the needed information.
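As a minimal sketch of such a query language in use, the following runs a standard SQL query through Python's built-in sqlite3 module; the table and figures are invented for illustration.

```python
import sqlite3

# In-memory database with an illustrative sales table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("south", 80.0), ("north", 30.0)])

# Retrieve aggregated information with a standardised SQL query.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
conn.close()
```

The same GROUP BY pattern carries over to most relational query dialects.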
- Business intelligence
The tools used to transform large amounts of raw data into relevant and helpful business information.
- Statistics
The study of statistical theory, methods and practices such as collection, organisation, analysis, interpretation and presentation of data. It deals with all aspects of data, including the planning of data collection in terms of the design of surveys and experiments, in order to forecast and plan work-related activities.
- Documentation types
The characteristics of internal and external documentation types aligned with the product life cycle and their specific content types.
- Resource description framework query language
The query languages such as SPARQL which are used to retrieve and manipulate data stored in Resource Description Framework format (RDF).
- Visual presentation techniques
The visual representation and interaction techniques, such as histograms, scatter plots, surface plots, tree maps and parallel coordinate plots, that can be used to present abstract numerical and non-numerical data, in order to reinforce the human understanding of this information.
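Behind every histogram sits a binning step. The sketch below computes bin counts in plain Python (a plotting library such as matplotlib would then render them as bars); the values and bin edges are illustrative.

```python
def histogram(values, bin_edges):
    """Count how many values fall into each half-open bin [lo, hi)."""
    counts = [0] * (len(bin_edges) - 1)
    for v in values:
        for i in range(len(counts)):
            if bin_edges[i] <= v < bin_edges[i + 1]:
                counts[i] += 1
                break
    return counts

# Six observations bucketed into three bins: [0, 3), [3, 6), [6, 10).
counts = histogram([1, 2, 2, 5, 7, 9], [0, 3, 6, 10])
```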
- Data models
The techniques and existing systems used for structuring data elements and showing relationships between them, as well as methods for interpreting the data structures and relationships.
- Information categorisation
The process of classifying the information into categories and showing relationships between the data for some clearly defined purposes.
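A hedged sketch of such categorisation: records are classified by keyword rules and then grouped so the relationships become visible. The rule table and tickets are invented for illustration.

```python
# Hypothetical keyword rules mapping categories to trigger words.
RULES = {"billing": ("invoice", "payment"), "support": ("error", "crash")}

def categorise(text):
    text = text.lower()
    for category, keywords in RULES.items():
        if any(k in text for k in keywords):
            return category
    return "other"

tickets = ["Invoice overdue", "App crash on login", "General question"]
by_category = {}
for t in tickets:
    by_category.setdefault(categorise(t), []).append(t)
```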
- Data quality assessment
The process of revealing data issues using quality indicators, measures and metrics in order to plan data cleansing and data enrichment strategies according to data quality criteria.
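Two of the simplest quality indicators are completeness (share of non-missing values) and uniqueness (share of distinct keys). The toy records below illustrate both; a real assessment would use a richer set of metrics.

```python
records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},             # incomplete: missing email
    {"id": 2, "email": "b@example.com"},  # duplicate id
]

# Completeness: fraction of records with a non-missing email.
completeness = sum(r["email"] is not None for r in records) / len(records)
# Uniqueness: fraction of distinct ids among all records.
uniqueness = len({r["id"] for r in records}) / len(records)
```

Low scores on either indicator would feed the cleansing and enrichment plan.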
- Information structure
The type of infrastructure which defines the format of data: semi-structured, unstructured and structured.
- Information extraction
The techniques and methods used for eliciting and extracting information from unstructured or semi-structured digital documents and sources.
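One common extraction technique is pattern matching over semi-structured text. The sketch below pulls date, level and message fields out of an invented log line using the stdlib re module.

```python
import re

log = "2024-01-05 ERROR disk full; 2024-01-06 INFO restarted"
# Pattern for "YYYY-MM-DD LEVEL message" entries separated by semicolons.
pattern = re.compile(r"(\d{4}-\d{2}-\d{2}) (ERROR|INFO) ([^;]+)")
events = [(d, level, msg.strip()) for d, level, msg in pattern.findall(log)]
```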
- Unstructured data
The information that is not arranged in a pre-defined manner or does not have a pre-defined data model and is difficult to understand and find patterns in without using techniques such as data mining.
- Information confidentiality
The mechanisms and regulations which allow for selective access control and guarantee that only authorised parties (people, processes, systems and devices) have access to data, the way to comply with confidential information and the risks of non-compliance.
- Data mining
The methods of artificial intelligence, machine learning, statistics and databases used to extract content from a dataset.
- Interpret current data
Analyse data gathered from sources such as market data, scientific papers, customer requirements and questionnaires which are current and up-to-date in order to assess development and innovation in areas of expertise.
- Execute analytical mathematical calculations
Apply mathematical methods and make use of calculation technologies in order to perform analyses and devise solutions to specific problems.
- Implement data quality processes
Apply quality analysis, validation and verification techniques on data to check data quality integrity.
- Collect ICT data
Gather data by designing and applying search and sampling methods.
- Define data quality criteria
Specify the criteria by which data quality is measured for business purposes, such as inconsistencies, incompleteness, usability for purpose and accuracy.
- Analyse big data
Collect and evaluate numerical data in large quantities, especially for the purpose of identifying patterns between the data.
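When the data volume exceeds memory, a common pattern is to evaluate it as a stream in fixed-size chunks. This sketch aggregates a numeric stream in constant memory; the chunk size and data are illustrative.

```python
def chunked_mean(stream, chunk_size=1000):
    """Mean of an arbitrarily large numeric stream, one chunk at a time."""
    total, count = 0.0, 0
    chunk = []
    for value in stream:
        chunk.append(value)
        if len(chunk) == chunk_size:
            total += sum(chunk)
            count += len(chunk)
            chunk.clear()
    total += sum(chunk)          # flush the final partial chunk
    count += len(chunk)
    return total / count if count else 0.0

mean = chunked_mean(range(10_000), chunk_size=256)
```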
- Apply statistical analysis techniques
Use models (descriptive or inferential statistics) and techniques (data mining or machine learning) for statistical analysis and ICT tools to analyse data, uncover correlations and forecast trends.
- Manage data
Administer all types of data resources through their lifecycle by performing data profiling, parsing, standardisation, identity resolution, cleansing, enhancement and auditing. Ensure the data is fit for purpose, using specialised ICT tools to fulfil the data quality criteria.
- Integrate ICT data
Combine data from different sources to provide a unified view of these data.
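A minimal sketch of such integration: two sources keyed on a shared customer id are merged into one unified view per customer. The source names and fields are invented.

```python
# Two hypothetical sources, both keyed by customer id.
crm = {1: {"name": "Ada"}, 2: {"name": "Grace"}}
billing = {1: {"balance": 50.0}, 2: {"balance": 0.0}}

# Unified view: one merged record per id across both sources.
unified = {
    cid: {**crm.get(cid, {}), **billing.get(cid, {})}
    for cid in crm.keys() | billing.keys()
}
```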
- Establish data processes
Use ICT tools to apply mathematical, algorithmic or other data manipulation processes in order to create information.
- Perform data cleansing
Detect and correct corrupt records in data sets, and ensure that the data are, and remain, structured according to guidelines.
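A sketch of such cleansing, assuming a simple invented guideline (non-empty name, numeric age between 0 and 120): fixable records are corrected, unfixable ones rejected.

```python
raw = [
    {"name": " Ada ", "age": "36"},   # fixable: trim name, parse age
    {"name": "", "age": "29"},        # corrupt: empty name
    {"name": "Grace", "age": "-5"},   # corrupt: impossible age
]

def clean(record):
    """Return a corrected record, or None if it violates the guideline."""
    name = record["name"].strip()
    try:
        age = int(record["age"])
    except ValueError:
        return None
    if not name or not (0 <= age <= 120):
        return None
    return {"name": name, "age": age}

cleaned = [c for c in (clean(r) for r in raw) if c is not None]
```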
- Normalise data
Reduce data to their accurate core form (normal forms) in order to minimise dependency, eliminate redundancy and increase consistency.
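The sketch below illustrates the idea on a toy flat table: customer details repeated on every order row are split out into a separate table, so each fact is stored once and redundancy is eliminated.

```python
# Denormalised input: the customer's city is repeated on every order.
flat = [
    {"order": 101, "customer": "Ada",   "city": "Paris"},
    {"order": 102, "customer": "Ada",   "city": "Paris"},
    {"order": 103, "customer": "Grace", "city": "Oslo"},
]

customers = {}   # customer -> city, each fact stored exactly once
orders = []      # orders reference the customer by key only
for row in flat:
    customers[row["customer"]] = row["city"]
    orders.append({"order": row["order"], "customer": row["customer"]})
```

Updating Ada's city now touches one record instead of every order row.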
- Handle data samples
Collect and select a set of data from a population by a statistical or other defined procedure.
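A minimal sketch of one such statistical procedure, simple random sampling without replacement, using the stdlib random module; seeding makes the draw reproducible.

```python
import random

population = list(range(100))
random.seed(42)                            # reproducible draw
sample = random.sample(population, k=10)   # sampling without replacement
```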
- Perform data mining
Explore large datasets to reveal patterns using statistics, database systems or artificial intelligence and present the information in a comprehensible way.
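As a tiny illustration of pattern discovery, the sketch below counts which item pairs occur together across transactions ("market basket" style co-occurrence); the baskets are invented, and real mining would add support/confidence thresholds.

```python
from collections import Counter
from itertools import combinations

baskets = [{"bread", "milk"}, {"bread", "milk", "eggs"}, {"milk", "eggs"}]

# Count every unordered item pair appearing in the same basket.
pair_counts = Counter()
for basket in baskets:
    pair_counts.update(combinations(sorted(basket), 2))

top_pair, top_count = pair_counts.most_common(1)[0]
```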
Optional knowledge and skills
- XQuery
- Information architecture
- MDX
- LINQ
- N1QL
- LDAP
- Report analysis results
- Data storage
- Manage data collection systems
- Web analytics
- Online analytical processing
- Deliver visual presentation of data
- Create data models
- SPARQL
- Gather data for forensic purposes
- Cloud technologies
- Database
Source: Sisyphus ODB