
Statistics for Computer Science Basics: Syllabus and Core Concepts



Course Overview

This course introduces students to the fundamental concepts of statistics, with a focus on applications in business, computer science, and data analysis. Students learn to summarize and graphically represent data, analyze measures of central tendency, and interpret relationships between variables.

  • Course Materials: Berenson / Levine / Szabat / Stephan: Basic Business Statistics, 14th Global Edition; MyLabStat online platform.

  • Assessment: Homework, project presentation, midterm, final exam, and in-class activity.

Professional Competencies

  • Data & Variables: Define types/scales of data, plan and assess data collection and quality.

  • Visualization: Organize, summarize, and present data using tables, charts, and diagrams.

  • Descriptive Measures: Compute and interpret summary statistics (mean, median, mode, etc.).

  • Probability & Distributions: Apply probability rules; use and understand discrete and continuous models.

  • Sampling & Inference: Understand sampling, confidence intervals, and hypothesis testing.

  • Information & Decision: Interpret statistical results and make data-driven decisions.

Methodological Competencies

  • Build clear, correctly labeled tables and charts in Excel/LibreOffice.

  • Select meaningful variables, record data consistently, and interpret results in context.

  • Summarize and visualize data using Python.

Personal/Social Competencies

  • Communicate statistical results clearly, tailored to the audience (purpose, technical depth, and format).

Course Structure & Weekly Topics

The course is organized into weekly modules, each focusing on a key area of statistics relevant to business and computer science.

Week 1: Introductions, Course overview and expectations, Types of data, Collecting and cleaning data
Week 2: Visualizing data
Week 3: Central Tendency and Variation
Week 4: Quartiles, Covariance, Correlation
Week 5: Probability
Week 6: Discrete Probability Distributions, Q&A before midterm exam
Week 7: Continuous Probability Distributions
Week 8: Sampling and Estimation
Week 9: Confidence Intervals
Week 10: Hypothesis Testing
Week 11: Hypothesis Testing (continued)
Week 12: Hypothesis Testing (continued), Project work
Week 13: Project presentations
Week 14: Revision (all topics)

Key Statistical Concepts

Types of Data

Understanding the types of data is essential for selecting appropriate statistical methods.

  • Qualitative (Categorical) Data: Data that can be grouped by categories (e.g., gender, color).

  • Quantitative (Numerical) Data: Data that can be measured numerically (e.g., height, income).

  • Scales of Measurement: Nominal, ordinal, interval, and ratio scales.

Data Collection and Cleaning

Accurate data collection and cleaning are crucial for reliable analysis.

  • Data Collection: Methods include surveys, experiments, and observational studies.

  • Data Cleaning: Removing errors, handling missing values, and ensuring consistency.
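The cleaning steps above can be sketched in Python using only the standard library. The records and field names below are hypothetical, chosen to illustrate dropping rows with missing values and normalizing inconsistent entries:

```python
# Hypothetical survey records with typical quality problems.
raw_records = [
    {"name": "Alice", "age": "34", "city": "berlin"},
    {"name": "Bob", "age": None, "city": "Munich"},      # missing age
    {"name": "Carol", "age": "29", "city": " BERLIN "},  # inconsistent casing/whitespace
]

def clean(records):
    cleaned = []
    for rec in records:
        if any(v is None for v in rec.values()):
            continue                                  # handle missing values: drop the row
        cleaned.append({
            "name": rec["name"].strip(),
            "age": int(rec["age"]),                   # enforce a consistent numeric type
            "city": rec["city"].strip().title(),      # ensure consistent formatting
        })
    return cleaned

print(clean(raw_records))
```

Dropping incomplete rows is only one strategy; imputing missing values is another common choice, depending on the analysis.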

Visualizing Data

Data visualization helps in understanding patterns and relationships.

  • Charts and Graphs: Bar charts, histograms, pie charts, scatter plots.

  • Tables: Organize data for clarity and comparison.
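As a minimal sketch of a frequency-based chart, the standard library alone can tally categories and draw a text bar chart (in the course itself, charts would be built in Excel/LibreOffice or with a Python plotting library; the grade data here is made up):

```python
from collections import Counter

# Made-up categorical data: letter grades for a small class.
grades = ["A", "B", "B", "C", "A", "B", "C", "C", "C", "A"]

counts = Counter(grades)                  # frequency table: category -> count
for category in sorted(counts):
    print(f"{category}: {'#' * counts[category]}")   # one '#' per observation
```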

Measures of Central Tendency

Central tendency describes the center of a data set.

  • Mean: The average value.

  • Median: The middle value when data are ordered.

  • Mode: The most frequently occurring value.
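All three measures are available in Python's built-in `statistics` module; a short sketch with an illustrative data set:

```python
import statistics

data = [2, 3, 3, 5, 7, 10]        # small illustrative sample

mean_val = statistics.mean(data)      # average: (2+3+3+5+7+10) / 6 = 5
median_val = statistics.median(data)  # middle of ordered data: (3+5) / 2 = 4
mode_val = statistics.mode(data)      # most frequent value: 3
```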

Measures of Variation

Variation measures the spread of data.

  • Range: Difference between the highest and lowest values.

  • Variance: Average squared deviation from the mean.

  • Standard Deviation: Square root of variance.
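A sketch of the same measures in Python, using `pvariance`/`pstdev` to match the "average squared deviation" (population) definition above; `statistics.variance` would instead divide by n − 1 for a sample:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]   # illustrative data with mean 5

data_range = max(data) - min(data)    # range: 9 - 2 = 7
var = statistics.pvariance(data)      # population variance: mean squared deviation = 4
sd = statistics.pstdev(data)          # standard deviation: sqrt(4) = 2
```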

Quartiles, Covariance, and Correlation

These measures help describe data distribution and relationships between variables.

  • Quartiles: Divide data into four equal parts.

  • Covariance: Measures how two variables change together.

  • Correlation: Standardized measure of relationship strength.

Probability

Probability quantifies the likelihood of events.

  • Basic Rules: Addition and multiplication rules.

  • Conditional Probability: Probability of event A given event B.
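The conditional-probability definition P(A|B) = P(A and B) / P(B) can be checked with a small worked example; the counts below are made up:

```python
# Made-up counts: out of 100 customers, 40 bought product B,
# and 10 bought both product A and product B.
total = 100
n_B = 40
n_A_and_B = 10

p_B = n_B / total                  # P(B) = 0.4
p_A_and_B = n_A_and_B / total      # P(A and B) = 0.1
p_A_given_B = p_A_and_B / p_B      # P(A|B) = 0.1 / 0.4 = 0.25
```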

Discrete and Continuous Probability Distributions

Probability distributions describe how probabilities are distributed over values.

  • Discrete Distributions: Binomial, Poisson, etc.

  • Continuous Distributions: Normal, exponential, etc.
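One example of each kind, sketched with the standard library: the binomial probability mass function computed directly from its definition, and a normal cumulative probability via `statistics.NormalDist`:

```python
import math
from statistics import NormalDist

# Discrete: P(X = k) for Binomial(n, p), straight from the formula
# P(X = k) = C(n, k) * p^k * (1-p)^(n-k).
def binomial_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Probability of exactly 2 heads in 4 fair coin flips: C(4,2) * 0.5^4 = 0.375
p_two_heads = binomial_pmf(2, 4, 0.5)

# Continuous: P(X <= x) via the CDF. For the standard normal,
# P(X <= 0) = 0.5 by symmetry.
p_below_mean = NormalDist(mu=0, sigma=1).cdf(0)
```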

Sampling and Estimation

Sampling allows inference about populations from samples.

  • Sampling Methods: Simple random, stratified, cluster sampling.

  • Estimation: Point and interval estimates of population parameters.
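Simple random sampling and a point estimate can be sketched in a few lines; the "population" below is artificial, and the seed is fixed only so the sketch is reproducible:

```python
import random
import statistics

random.seed(42)                        # fixed seed for reproducibility
population = list(range(1, 1001))      # artificial population; true mean is 500.5

sample = random.sample(population, k=50)    # simple random sample without replacement
point_estimate = statistics.mean(sample)    # point estimate of the population mean
```

An interval estimate would widen this single number into a range, which is what confidence intervals (next topic) provide.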

Confidence Intervals

Confidence intervals provide a range of values for population parameters.

  • Formula for Confidence Interval (mean): x̄ ± z_(α/2) · (σ / √n), where x̄ is the sample mean, σ the known population standard deviation, n the sample size, and z_(α/2) the critical value for the chosen confidence level (e.g., 1.96 for 95%). When σ is unknown, the t-distribution is used instead.
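A sketch of a 95% z-interval for the mean, assuming σ is known; the sample mean, σ, and n are made-up numbers:

```python
import math
from statistics import NormalDist

x_bar = 50.0    # sample mean (made up)
sigma = 10.0    # known population standard deviation (made up)
n = 100         # sample size

z = NormalDist().inv_cdf(0.975)                  # critical value for 95%: about 1.96
margin = z * sigma / math.sqrt(n)                # margin of error: about 1.96
lower, upper = x_bar - margin, x_bar + margin    # interval: about (48.04, 51.96)
```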

Hypothesis Testing

Hypothesis testing is used to make decisions about population parameters.

  • Steps: State hypotheses, select significance level, compute test statistic, make decision.

  • Test Statistic Example (z-test): z = (x̄ − μ₀) / (σ / √n), where μ₀ is the hypothesized population mean and σ is the known population standard deviation.
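The four steps above, walked through for a two-sided one-sample z-test with made-up numbers (H0: μ = 50 versus H1: μ ≠ 50, σ assumed known):

```python
import math
from statistics import NormalDist

x_bar, mu0, sigma, n = 52.0, 50.0, 10.0, 100     # made-up sample summary

z = (x_bar - mu0) / (sigma / math.sqrt(n))       # test statistic: (52-50)/(10/10) = 2.0
p_value = 2 * (1 - NormalDist().cdf(abs(z)))     # two-sided p-value: about 0.0455

reject_h0 = p_value < 0.05                       # decision at the 5% significance level
```

Here the p-value falls just below 0.05, so at the 5% level the null hypothesis is rejected.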

Communicating Results

Effective communication of statistical findings is essential for decision-making.

  • Tailor reports: Adjust technical depth and format for the audience.

  • Use clear visuals: Support findings with tables and charts.

Additional info: The course emphasizes practical skills in Excel/LibreOffice and Python for data analysis, and includes project work and presentations to develop communication skills.
