Using Openshift — Dataiku DSS 13 documentation

You can use containerized execution with OpenShift 4 as the underlying Kubernetes engine. Dataiku leverages Kubernetes as a pure base ...

Dataiku DSS - Reference documentation — Dataiku DSS 13 ...

The Developer Guide contains all information for developers using Dataiku: how to code in Dataiku, how to create applications, how to operate Dataiku through ...

Elastic AI computation — Dataiku DSS 13 documentation

DSS 13 Release notes - Dataiku Documentation

Concepts — Dataiku DSS 13 documentation

In general, running Dataiku DSS as a container (either by running Docker directly or through Kubernetes) is incompatible with the ability to leverage ...

Initial setup — Dataiku DSS 13 documentation

Many Kubernetes setups will be based on managed Kubernetes clusters handled by your Cloud Provider. DSS provides deep integrations with these.

Managed Kubernetes clusters — Dataiku DSS 13 documentation

In this case, you can use the variables expansion mechanism of DSS. To denote the contextual cluster to use at the project level, use the syntax ${variable_name ...
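The snippet is cut off, but the pattern is: store the cluster name in a project variable, then reference it from the cluster setting with the ${...} expansion syntax. A minimal sketch, assuming the public dataikuapi client and illustrative host, key, and variable names:

```python
# Minimal sketch using the public dataikuapi Python client. The host,
# API key, project key, and the variable name "k8s_cluster" are all
# illustrative assumptions, not values from the documentation above.
import dataikuapi

client = dataikuapi.DSSClient("https://dss.example.com", "YOUR_API_KEY")
project = client.get_project("MYPROJECT")

# Project variables come back as a dict with "standard" and "local" sections.
variables = project.get_variables()
variables["standard"]["k8s_cluster"] = "team-a-cluster"
project.set_variables(variables)
```

With that in place, setting the project's cluster to ${k8s_cluster} lets each project resolve its own contextual cluster.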

Unmanaged Kubernetes clusters — Dataiku DSS 13 documentation

To use a single unmanaged cluster, you must have an existing Kubernetes cluster that is running version 1.10 or later.
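Before attaching such a cluster, it is worth verifying its version. A hedged sketch with the official kubernetes Python client, assuming your kubeconfig already points at the cluster:

```python
# Hedged sketch: check an existing cluster's version before attaching it
# to DSS as an unmanaged cluster. Assumes a kubeconfig context is set up.
from kubernetes import client, config

config.load_kube_config()
version = client.VersionApi().get_code()
print(version.git_version)  # e.g. "v1.27.3"; must be 1.10 or later
```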

Using code envs with containerized execution — Dataiku DSS 13 ...

After each upgrade of DSS, you must rebuild all base images and then all code env images. You can rebuild code env images by running ./bin/dssadmin build- ...

Containerized DSS engine - Dataiku Documentation

The recipes will run with the engine named “DSS”, meaning that the DSS node itself will provide the compute resources. This can lead to over-consumption ...

Deploying on Kubernetes — Dataiku DSS 13 documentation

Using the API Deployer, you can deploy your API services to a Kubernetes cluster. Each API Service Deployment (see Concepts) is set up on Kubernetes as: A ...
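The list is truncated, but the deployment materializes as ordinary Kubernetes objects, so you can inspect them with standard tooling. A sketch using the official kubernetes Python client; the namespace name is an illustrative assumption:

```python
# Hedged sketch: list the Deployments backing API services in the
# namespace DSS deploys into ("dss-apideployer" is an assumption).
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()
for dep in apps.list_namespaced_deployment("dss-apideployer").items:
    print(dep.metadata.name, dep.status.ready_replicas)
```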

OpenShift Container Platform 4.13 - Red Hat Documentation

Highlights of what is new and what has changed with this OpenShift Container Platform release. Security and compliance. Learning about and managing security ...

Troubleshooting — Dataiku DSS 13 documentation

If you see the error above in a Spark on Kubernetes container, you will need to set spark.driver.host to the DNS name or IP address of the DSS backend. You can do ...
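As a hedged illustration of the fix, a minimal PySpark sketch; the host name is a placeholder for your actual DSS backend address, not a value from the documentation:

```python
# Minimal sketch: in client mode, Spark executors running as Kubernetes
# pods must be able to call back to the driver, so the driver advertises
# the DSS backend address. "dss-backend.example.com" is a placeholder.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("dss-spark-on-k8s")
    .config("spark.driver.host", "dss-backend.example.com")
    .getOrCreate()
)
```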

Dataiku — SQreamDB 4.8 documentation

Amazon S3 connection set up in DSS. Python 3.9. Establishing a Dataiku Connection. In your Dataiku web interface: Upload the plugin from ...

Dynamic namespace management — Dataiku DSS 13 documentation

In Kubernetes, the namespace is the unit for access control and resources control. DSS can either use a single namespace, multiple static namespaces, or ...
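To make the dynamic option concrete, here is a small sketch of the kind of per-user namespace derivation involved; the prefix and sanitization rules are illustrative assumptions, not DSS's actual implementation:

```python
import re

# Illustrative sketch of per-user namespace naming; the "dss-ns-" prefix
# and the sanitization below are assumptions, not DSS's implementation.
def namespace_for_user(user_login: str, prefix: str = "dss-ns-") -> str:
    # Kubernetes namespace names must be lowercase DNS-1123 labels.
    label = re.sub(r"[^a-z0-9]+", "-", user_login.lower()).strip("-")
    return (prefix + label)[:63]  # DNS labels max out at 63 characters

print(namespace_for_user("Jane_Doe"))  # -> dss-ns-jane-doe
```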

Using managed EKS clusters — Dataiku DSS 13 documentation

To use Amazon Elastic Kubernetes Service (EKS), begin by installing the “EKS clusters” plugin from the Plugins store in Dataiku DSS.

Welcome | About | OpenShift Container Platform 4.13

Use the left navigation bar to browse the documentation. Select the task that interests you from the contents of this Welcome page. Start with Architecture and ...

Exploring Dataiku DSS 12.6: Harness the Power of Generative AI

To enable the features we'll be reviewing, you need to start in the Administrative settings of your DSS instance. If you don't ...

Using managed AKS clusters — Dataiku DSS 13 documentation

To use Microsoft Azure Kubernetes Service (AKS), begin by installing the “AKS clusters” plugin from the Plugins store in Dataiku DSS.

Using Docker instead of Kubernetes - Dataiku Documentation

In addition to pushing to Kubernetes, DSS can leverage standalone Docker daemons. This is a very specific setup, and we recommend using Kubernetes instead.