Spark

PySpark

To use PySpark in Deepnote, you first need to install a JDK, since Spark runs on the JVM. Run the following commands in a notebook cell:

# Refresh the package index
! sudo apt-get update
# Work around a missing man-page directory that breaks the JDK install on slim images
! sudo mkdir -p /usr/share/man/man1
# Install OpenJDK 11 and PySpark
! sudo apt-get install -y openjdk-11-jdk
! pip install pyspark
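
If you want to confirm that the JDK is available before moving on, a quick optional check from a cell is:

! java -version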

Now you can test it in a notebook cell with:

from pyspark import SparkContext

# Create a SparkContext that runs Spark in local mode
sc = SparkContext("local", "First App")

Alternatively, you can open an interactive PySpark shell from a terminal:

pyspark
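
In either case, once a SparkContext is available (the interactive shell creates one for you as sc), you can verify that Spark actually executes work. The following is a minimal sketch; the rdd and squares names are only illustrative:

# Distribute a small range across the local executor, square each element,
# and collect the results back to the driver.
rdd = sc.parallelize(range(10))
squares = rdd.map(lambda x: x * x).collect()
print(squares)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]

# Stop the context to release local Spark resources when you're done.
sc.stop()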