Authentication - Connecting to a Secure Cluster

This feature is available only in the Enterprise version of SnappyData.

Authentication is the process of verifying a user's identity. When a user tries to log in, the request is forwarded to the configured LDAP directory to verify the credentials. You can connect to a secure cluster in several ways: using the JDBC (Thin) client, the ODBC driver, Smart Connector mode, or Snappy jobs. In all cases, users must provide valid credentials to access the cluster.

Using JDBC (Thin) Client

When using the JDBC client, provide the user credentials using connection properties 'user' and 'password'.

Example: JDBC Client

import java.sql.DriverManager
import java.util.Properties

val props = new Properties()
props.setProperty("user", username)
props.setProperty("password", password)

val url: String = "jdbc:snappydata://localhost:1527/"
val conn = DriverManager.getConnection(url, props)

Example: Snappy shell

connect client 'localhost:1527;user=user1;password=user123';

Using ODBC Driver

You can also connect to the SnappyData cluster using the SnappyData ODBC Driver with a connection string of the following form:

Driver=SnappyData ODBC Driver;server=<ServerHost>;port=<ServerPort>;user=<userName>;password=<password>

For more information, refer to How to Connect using ODBC Driver.

Using Smart Connector Mode

In Smart Connector mode, provide the user credentials as Spark configuration properties named spark.snappydata.store.user and spark.snappydata.store.password.

In the example below, these properties are set on the SparkConf that is used to create the SnappyContext in your job.

val conf = new SparkConf()
    .setAppName("My Spark Application with SnappyData")
    .set("spark.executor.cores", TestUtils.defaultCores.toString)
    .set("snappydata.connection", snappydataLocatorURL)
    .set("spark.snappydata.store.user", username)
    .set("spark.snappydata.store.password", password)
val sc = SparkContext.getOrCreate(conf)
val snc = SnappyContext(sc)

The example below demonstrates how to connect to the cluster from the Spark shell, using the --conf option to specify the properties.

$ ./bin/spark-shell \
    --master local[*] \
    --conf spark.snappydata.connection=localhost:1527 \
    --conf spark.snappydata.store.user=<username> \
    --conf spark.snappydata.store.password=<password>

Alternatively, you can specify the user credentials in the Spark configuration file.
Spark reads these properties when you launch the spark-shell or invoke spark-submit. To do so, specify the user credentials in the spark-defaults.conf file, located in the conf directory.

In this file, you can specify:

    spark.snappydata.store.user        <username>
    spark.snappydata.store.password    <password>

Using Snappy Jobs

When submitting Snappy jobs using snappy-job.sh, provide user credentials through a configuration file using the --passfile option.

For example, a sample configuration file is provided below:

$ cat /home/user1/snappy/job.config 
-u user1:password

In the below example, the above configuration file is passed when submitting a job.

$ ./bin/snappy-job.sh submit  \
    --lead localhost:8090  \
    --app-name airlineApp \
    --class  io.snappydata.examples.CreateAndLoadAirlineDataJob \
    --app-jar $SNAPPY_HOME/examples/jars/quickstart.jar \
    --passfile /home/user1/snappy/job.config
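For reference, a Snappy job such as the one submitted above is a class implementing the SnappyData job API. The sketch below is hypothetical (the class and table names are illustrative) and assumes the SnappySQLJob trait; the session passed to the job carries the credentials supplied via --passfile.

    import com.typesafe.config.Config
    import org.apache.spark.sql.{SnappyJobValid, SnappyJobValidation, SnappySQLJob, SnappySession}

    // Hypothetical job class: the SnappySession handed to the job is already
    // associated with the credentials supplied through --passfile, so all
    // operations below run with that user's authorization.
    object SampleSecureJob extends SnappySQLJob {

      // Validate the job before it is scheduled; accept unconditionally here.
      override def isValidJob(snSession: SnappySession, config: Config): SnappyJobValidation =
        SnappyJobValid()

      // The body of the job; return value is reported in the job status.
      override def runSnappyJob(snSession: SnappySession, jobConfig: Config): Any = {
        // AIRLINE is an illustrative table name; this query is authorized
        // as the submitting user (user1 in the example above).
        snSession.sql("SELECT COUNT(*) FROM AIRLINE").collect()(0).getLong(0)
      }
    }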


  • When checking the status of a job using status --jobid, provide the user credentials through a configuration file using the --passfile option.

  • Only trusted users should be allowed to submit jobs, as an untrusted user may be able to do harm by invoking internal APIs from within a job, bypassing the authorization checks.

  • Currently, the Spark JobServer UI may not be accessible when security is enabled, but you can use the snappy-job.sh script to access the required information using commands like status, listcontexts, and so on.
    Execute ./bin/snappy-job.sh for more details.

  • The configuration file should be in a secure location with read access only to an authorized user.

  • These user credentials are passed to the Snappy session available in the job and are used to authorize the operations performed by the job.
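For example, checking the status of the job submitted earlier might look like the following. The job ID placeholder must be replaced with the ID printed by the submit command, and the same --passfile supplies the credentials:

    $ ./bin/snappy-job.sh status \
        --lead localhost:8090 \
        --jobid <job-id> \
        --passfile /home/user1/snappy/job.config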