Profession: data scientist

Data scientists find and interpret rich data sources, manage large amounts of data, merge data sources, ensure consistency of datasets, and create visualisations to aid in understanding the data. They build mathematical models using data, present and communicate data insights and findings to specialists and scientists in their team and, if required, to a non-expert audience, and recommend ways to apply the data.

Knowledge

  • Information categorisation

    The process of classifying information into categories and showing relationships between the data for clearly defined purposes.

  • Visual presentation techniques

    The visual representation and interaction techniques, such as histograms, scatter plots, surface plots, tree maps and parallel coordinate plots, used to present abstract numerical and non-numerical data in order to reinforce human understanding of this information (a minimal plotting sketch follows this list).

  • Resource description framework query language

    Query languages, such as SPARQL, that are used to retrieve and manipulate data stored in Resource Description Framework (RDF) format (see the sketch after this list).

  • Information extraction

    The techniques and methods used for eliciting and extracting information from unstructured or semi-structured digital documents and sources.

  • Statistics

    The study of statistical theory, methods and practices such as the collection, organisation, analysis, interpretation and presentation of data. It deals with all aspects of data, including the planning of data collection in terms of the design of surveys and experiments, in order to forecast and plan work-related activities.

  • Data models

    The techniques and existing systems used for structuring data elements and showing relationships between them, as well as methods for interpreting the data structures and relationships.

  • Data mining

    The methods of artificial intelligence, machine learning, statistics and databases used to extract content from a dataset.

  • Online analytical processing

    Online tools that analyse, aggregate and present multi-dimensional data, enabling users to interactively and selectively extract and view data from specific points of view (a pivot-table sketch follows this list).

  • Query languages

    The field of standardised computer languages for retrieval of information from a database and of documents containing the needed information.
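
The "visual presentation techniques" item above lends itself to a short illustration. The following is a minimal sketch in Python using NumPy and matplotlib (assumed, commonly used tools; the data are synthetic) that draws a histogram and a scatter plot:

    # A minimal sketch: a histogram and a scatter plot on synthetic data.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(seed=0)
    values = rng.normal(loc=50, scale=10, size=1000)      # one numerical variable
    x = rng.uniform(0, 1, size=200)
    y = 2 * x + rng.normal(scale=0.1, size=200)           # a related second variable

    fig, (ax_hist, ax_scatter) = plt.subplots(1, 2, figsize=(10, 4))
    ax_hist.hist(values, bins=30)                         # distribution of one variable
    ax_hist.set_title("Histogram")
    ax_scatter.scatter(x, y, s=10)                        # relationship between two variables
    ax_scatter.set_title("Scatter plot")
    fig.tight_layout()
    plt.show()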
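
For the "resource description framework query language" item, here is a minimal sketch of running a SPARQL query over RDF data with the Python rdflib library (an assumed tooling choice; the example graph and properties are illustrative):

    # A minimal sketch of querying RDF data with SPARQL via rdflib.
    from rdflib import Graph

    TURTLE = """
    @prefix ex: <http://example.org/> .
    ex:alice ex:role "data scientist" ; ex:knows ex:bob .
    ex:bob   ex:role "analyst" .
    """

    g = Graph()
    g.parse(data=TURTLE, format="turtle")

    # Retrieve every person together with their role.
    results = g.query("""
        PREFIX ex: <http://example.org/>
        SELECT ?person ?role
        WHERE { ?person ex:role ?role . }
    """)

    for person, role in results:
        print(person, role)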
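
For the "online analytical processing" item, a pivot table gives the flavour of OLAP-style aggregation. The sketch below uses pandas (an assumed tooling choice) on illustrative sales records to aggregate a measure along two dimensions:

    # A minimal sketch of OLAP-style aggregation: pivoting sales records into a
    # multi-dimensional view by region and quarter (the data are illustrative).
    import pandas as pd

    sales = pd.DataFrame({
        "region":  ["North", "North", "South", "South", "North", "South"],
        "quarter": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q1"],
        "product": ["A", "A", "A", "B", "B", "B"],
        "revenue": [100, 120, 90, 150, 80, 60],
    })

    # Aggregate revenue along the region x quarter dimensions; drilling down by
    # product would simply add another level to the index or columns.
    cube = sales.pivot_table(index="region", columns="quarter",
                             values="revenue", aggfunc="sum", fill_value=0)
    print(cube)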

Skills

  • Develop data processing applications

    Create customised software for processing data by selecting and using the appropriate computer programming language, so that an ICT system can produce the demanded output based on the expected input.

  • Handle data samples

    Collect and select a set of data from a population by a statistical or other defined procedure (a sampling sketch follows this list).

  • Execute analytical mathematical calculations

    Apply mathematical methods and make use of calculation technologies in order to perform analyses and devise solutions to specific problems.

  • Collect ICT data

    Gather data by designing and applying search and sampling methods.

  • Deliver visual presentation of data

    Create visual representations of data such as charts or diagrams for easier understanding.

  • Design database scheme

    Draft a database scheme by following the Relational Database Management System (RDBMS) rules in order to create a logically arranged group of objects such as tables, columns and processes (a schema sketch follows this list).

  • Report analysis results

    Produce research documents or give presentations to report the results of a conducted research and analysis project, indicating the analysis procedures and methods which led to the results, as well as potential interpretations of the results.

  • Normalise data

    Reduce data to their accurate core form (normal forms) in order to achieve such results as minimisation of dependency, elimination of redundancy, increase of consistency.

  • Build recommender systems

    Construct recommendation systems based on large data sets, using programming languages or computer tools, to create a subclass of information filtering system that seeks to predict the rating or preference a user would give to an item (a sketch follows this list).

  • Interpret current data

    Analyse data gathered from sources such as market data, scientific papers, customer requirements and questionnaires which are current and up-to-date in order to assess development and innovation in areas of expertise.

  • Perform data cleansing

    Detect and correct corrupt records from data sets, and ensure that the data become and remain structured according to guidelines (a cleansing sketch follows this list).

  • Establish data processes

    Use ICT tools to apply mathematical, algorithmic or other data manipulation processes in order to create information.

  • Implement data quality processes

    Apply quality analysis, validation and verification techniques to data in order to check data quality and integrity.

  • Manage data collection systems

    Develop and manage methods and strategies used to maximise data quality and statistical efficiency in the collection of data, in order to ensure the gathered data are optimised for further processing.
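
For the "handle data samples" skill, the following minimal sketch uses pandas (an assumed tooling choice) to draw a simple random sample and a stratified sample from an illustrative population:

    # A minimal sketch of drawing a sample from a population: a simple random
    # sample and a stratified sample by group (data are illustrative).
    import pandas as pd

    population = pd.DataFrame({
        "customer_id": range(1, 1001),
        "segment": ["consumer"] * 700 + ["business"] * 300,
    })

    # Simple random sample of 10% of the population.
    simple = population.sample(frac=0.10, random_state=42)

    # Stratified sample: 10% from each segment, preserving segment proportions.
    stratified = (population.groupby("segment", group_keys=False)
                            .sample(frac=0.10, random_state=42))

    print(len(simple), stratified["segment"].value_counts().to_dict())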
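
For the "design database scheme" skill, the sketch below drafts a small relational schema with the sqlite3 module from the Python standard library; the tables and columns are illustrative, not prescribed by the source:

    # A minimal sketch of drafting a relational schema: two tables linked by a
    # foreign key, with a simple integrity constraint.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customer (
            customer_id INTEGER PRIMARY KEY,
            name        TEXT NOT NULL
        );
        CREATE TABLE orders (
            order_id    INTEGER PRIMARY KEY,
            customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
            placed_at   TEXT NOT NULL,
            amount      REAL NOT NULL CHECK (amount >= 0)
        );
    """)
    conn.execute("INSERT INTO customer VALUES (1, 'Ada')")
    conn.execute("INSERT INTO orders VALUES (10, 1, '2024-01-01', 99.90)")
    print(conn.execute(
        "SELECT name, amount FROM orders JOIN customer USING (customer_id)"
    ).fetchall())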
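
For the "build recommender systems" skill, the following sketch implements item-based collaborative filtering with NumPy: it predicts a user's rating for an unseen item from the items they have rated, weighted by item-item cosine similarity. The rating matrix is illustrative:

    # A minimal sketch of an item-based recommender on a tiny rating matrix;
    # rows are users, columns are items, 0 means "not rated".
    import numpy as np

    ratings = np.array([
        [5, 4, 0, 1],
        [4, 5, 1, 0],
        [1, 0, 5, 4],
        [0, 1, 4, 5],
    ], dtype=float)

    def cosine_sim(m):
        # Cosine similarity between the columns (items) of m.
        norms = np.linalg.norm(m, axis=0, keepdims=True)
        norms[norms == 0] = 1.0
        return (m.T @ m) / (norms.T @ norms)

    item_sim = cosine_sim(ratings)

    def predict(user, item):
        # Weighted average of the user's known ratings, weighted by how
        # similar each rated item is to the target item.
        rated = ratings[user] > 0
        weights = item_sim[item, rated]
        if weights.sum() == 0:
            return 0.0
        return float(weights @ ratings[user, rated] / weights.sum())

    print(round(predict(user=0, item=2), 2))   # predicted rating for an unrated item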
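
For the "perform data cleansing" skill, the sketch below uses pandas (assumed tooling) to remove duplicate records, coerce types, and drop records that violate simple validity rules on illustrative data:

    # A minimal sketch of data cleansing: deduplication, type coercion, and
    # enforcement of basic validity rules.
    import pandas as pd

    raw = pd.DataFrame({
        "order_id": [1, 2, 2, 3, 4],
        "amount":   ["10.5", "20.0", "20.0", "oops", "-5"],
        "country":  ["DE", "de", "DE", "FR", "FR"],
    })

    clean = (raw.drop_duplicates(subset="order_id")            # remove duplicate records
                .assign(amount=lambda d: pd.to_numeric(d["amount"], errors="coerce"),
                        country=lambda d: d["country"].str.upper())  # normalise casing
                .dropna(subset=["amount"]))                    # drop unparsable amounts
    clean = clean[clean["amount"] > 0]                         # enforce a validity rule
    print(clean)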

Optional knowledge and skills

unstructured data, XQuery, perform data mining, create data models, LDAP, integrate ICT data, SPARQL, MDX, manage ICT data classification, N1QL, manage data, manage ICT data architecture, data quality assessment, define data quality criteria, business intelligence, LINQ

Source: Sisyphus ODB