Can you sort multiple columns at one time?
Multiple sorting refers to sorting one column and then sorting another column while keeping the first column's order intact. In Excel, you can definitely sort multiple columns at one time.
To do multiple sorting, you need to use the Sort Dialog Box. Now, to get this, you can select the data that you want to sort and then click on the Data Tab. After that, click on the Sort icon.
In this dialog box, you can specify the sorting details for one column and then add a sort level for another column by clicking on the Add Level button.
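If you handle the same data programmatically, the equivalent multi-column sort is a one-liner in pandas. A minimal sketch (the column names are made up for illustration):

```python
import pandas as pd

# Hypothetical sales data; the column names are illustrative only.
df = pd.DataFrame({
    "region": ["East", "West", "East", "West"],
    "sales":  [250, 400, 100, 300],
})

# Sort by region first, then by sales (descending) within each region,
# mirroring Excel's "Add Level" behaviour in the Sort dialog box.
print(df.sort_values(by=["region", "sales"], ascending=[True, False]))
```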
What are some Python libraries used in Data Analysis?
Some of the vital Python libraries used in Data Analysis include –
- Bokeh
- Matplotlib
- NumPy
- Pandas
- Scikit-learn
- SciPy
- Seaborn
- TensorFlow
- Keras
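As a quick illustration of how a few of these libraries work together in a typical analysis, here is a minimal sketch using NumPy, pandas, and Matplotlib (the data is made up):

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Made-up data: NumPy generates the arrays, pandas holds the table.
rng = np.random.default_rng(seed=42)
df = pd.DataFrame({
    "day": np.arange(100),
    "metric": np.cumsum(rng.normal(size=100)),
})

# Matplotlib (often wrapped by Seaborn) handles the visualization.
df.plot(x="day", y="metric", title="Daily metric (simulated)")
plt.show()
```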
What are the important steps in the data validation process?
As the name suggests, Data Validation is the process of validating data. It mainly involves two processes: Data Screening and Data Verification.
- Data Screening: Different kinds of algorithms are used in this step to screen the entire data and flag any inaccurate or suspicious values (a simple example is sketched below).
- Data Verification: Every suspected value is evaluated against various use cases, and a final decision is then taken on whether the value should be included in the data or not.
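As a concrete (and deliberately simplified) illustration of the screening step, here is a sketch that flags outliers with the common 1.5 × IQR rule; the data and column name are assumptions, not part of any standard process:

```python
import pandas as pd

# Hypothetical measurements; the column name and values are illustrative.
df = pd.DataFrame({"value": [10.1, 9.8, 10.3, 58.0, 10.0, 9.9]})

# Data Screening: flag values outside 1.5 * IQR of the quartiles,
# a common rule of thumb for spotting potential outliers.
q1, q3 = df["value"].quantile([0.25, 0.75])
iqr = q3 - q1
df["suspect"] = ~df["value"].between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)

# Data Verification would then review each flagged row against known
# use cases before deciding whether to keep, correct, or drop it.
print(df[df["suspect"]])
```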
What are the key requirements for becoming a Data Analyst?
This data analyst interview question tests your knowledge about the skill set required to become a data analyst.
To become a data analyst, you need to:
- Be well-versed with programming languages and tools (JavaScript, XML, or ETL frameworks) and databases (SQL, SQLite, Db2, etc.), and have extensive knowledge of reporting packages (e.g., Business Objects).
- Be able to collect, organize, analyze, and disseminate Big Data efficiently.
- Have substantial technical knowledge in fields like database design, data mining, and segmentation techniques.
- Have sound knowledge of statistical packages for analyzing massive datasets, such as SAS, Excel, and SPSS, to name a few.
What do you think are the criteria to say whether a developed data model is good or not?
Well, the answer to this question may vary from person to person. But below are a few criteria that, in my opinion, must be considered when deciding whether a developed data model is good or not:
- A model developed for the dataset should have predictable performance, so it can be used to make reliable forecasts (a quick way to check this is sketched after the list).
- A model is said to be a good model if it can easily adapt to changes according to business requirements.
- If the data gets changed, the model should be able to scale according to the data.
- The model developed should also be easy for the clients to consume, so that it drives actionable and profitable results.
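To make the "predictable performance" criterion concrete, one common check is cross-validation: if the model scores consistently across folds, its performance is predictable. A minimal sketch with scikit-learn, using placeholder data and a placeholder model rather than any prescribed choice:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Placeholder data; in practice this is the prepared business dataset.
X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

# A good model should score consistently across folds,
# not just on the data it happened to be trained on.
scores = cross_val_score(LinearRegression(), X, y, cv=5, scoring="r2")
print("R^2 per fold:", scores.round(3))
print("Mean R^2:", round(scores.mean(), 3))
```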
Explain what MapReduce is.
MapReduce is a framework for processing large data sets: it splits them into subsets, processes each subset on a different server, and then blends the results obtained from each.
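The classic illustration is a word count. Here is a minimal, single-machine sketch of the two phases in Python; real frameworks such as Hadoop run the same steps distributed across servers:

```python
from collections import Counter
from functools import reduce

documents = ["the quick brown fox", "the lazy dog", "the quick dog"]

# Map phase: each document is processed independently (on a real cluster,
# each subset would run on a different server), emitting partial counts.
partial_counts = [Counter(doc.split()) for doc in documents]

# Reduce phase: the partial results are blended into the final answer.
total = reduce(lambda a, b: a + b, partial_counts, Counter())
print(total)
```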
What are the advantages of version control?
The main advantages of version control are –
- It allows you to compare files, identify differences, and consolidate the changes seamlessly.
- It helps to keep track of application builds by identifying which version is under which category – development, testing, QA, and production.
- It maintains a complete history of project files that comes in handy if ever there’s a central server breakdown.
- It is excellent for storing and maintaining multiple versions and variants of code files securely.
- It allows you to see the changes made in the content of different files.
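In practice, these advantages map directly onto everyday commands in Git, the most widely used version control system. A minimal illustration (the file and branch names are placeholders):

```sh
# Compare a file against the last commit and see the differences.
git diff HEAD -- report.py

# View the complete history of a project file.
git log --oneline -- report.py

# Merge a feature branch, consolidating changes from parallel work.
git merge feature/cleanup
```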
Explain the typical data analysis process.
Data analysis deals with collecting, inspecting, cleansing, transforming, and modelling data to glean valuable insights and support better decision-making in an organization. The various steps involved in the data analysis process include –
Data Exploration
Having identified the business problem, a data analyst has to go through the data provided by the client to analyze the root cause of the problem.
Data Preparation
This is the most crucial step of the data analysis process, wherein any anomalies in the data (like missing values and outliers) have to be detected and treated so that the subsequent modelling moves in the right direction.
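For instance, a minimal pandas sketch of this step might look like the following (the column names and cleaning choices are illustrative, not a fixed recipe):

```python
import pandas as pd

# Made-up raw data with a missing value and an obvious outlier.
df = pd.DataFrame({
    "age": [25, 31, None, 29, 142],
    "income": [40, 52, 48, 50, 51],
})

# Missing values: fill with the median (one of several reasonable options).
df["age"] = df["age"].fillna(df["age"].median())

# Outliers: drop rows outside a plausible range for the column.
df = df[df["age"].between(0, 120)]
print(df)
```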
Data Modelling
The modelling step begins once the data has been prepared. Modelling is an iterative process wherein the model is run repeatedly for improvements. Data modelling ensures that the best possible result is found for a given business problem.
Validation
In this step, the model provided by the client and the model developed by the data analyst are validated against each other to find out if the developed model will meet the business requirements.
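A minimal sketch of such a validation check, holding out data the model never sees during training (the data and model here are placeholders):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Placeholder data standing in for the client's dataset.
X, y = make_classification(n_samples=400, n_features=6, random_state=0)

# Hold out a validation set to check the model against unseen data.
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Validation accuracy:", model.score(X_val, y_val))
```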
What do you mean by DBMS? What are its different types?
A Database Management System (DBMS) is a software application that interacts with the user, applications and the database itself to capture and analyze data. The data stored in the database can be modified, retrieved and deleted, and can be of any type like strings, numbers, images etc.
There are mainly 4 types of DBMS, which are Hierarchical, Relational, Network, and Object-Oriented DBMS.
Hierarchical DBMS: As the name suggests, this type of DBMS has a style of predecessor-successor type of relationship. So, it has a structure similar to that of a tree, wherein the nodes represent records and the branches of the tree represent fields.
Relational DBMS (RDBMS): This type of DBMS uses a structure that allows the users to identify and access data in relation to another piece of data in the database.
Network DBMS: This type of DBMS supports many-to-many relationships, wherein multiple member records can be linked.
Object-oriented DBMS: This type of DBMS uses small individual software called objects. Each object contains a piece of data and the instructions for the actions to be done with the data.
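Since the relational type is what a data analyst encounters most often, here is a minimal sketch using Python's built-in sqlite3 module; the table and column names are made up for illustration:

```python
import sqlite3

# In-memory SQLite database; the schema and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, "
    "customer_id INTEGER REFERENCES customers(id), amount REAL)"
)
conn.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Ada"), (2, "Grace")])
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 1, 30.0), (2, 1, 12.5), (3, 2, 99.9)],
)

# The relational idea in action: access data in relation to other data
# via a join across tables.
query = """
    SELECT c.name, SUM(o.amount)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
"""
for row in conn.execute(query):
    print(row)
```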