TeraLab offers a custom, dedicated space for Data Science experimentation, including powerful computing nodes, GPUs, and popular tools and applications such as JupyterLab, Python, and R. Data security and sovereignty are essential to TeraLab, so several levels of autonomy within the workspace are proposed.
SERVICE DESCRIPTION
TeraLab will provide you with a secure, isolated working environment called a workspace. A workspace is highly customizable in terms of CPU, RAM, GPU (subject to availability), hot storage, cold storage (backup and archive), Linux distribution, and security configuration.
TeraLab favours the most popular tools and applications for Data Science, in particular free and open-source offerings, including:
- Python
- Anaconda
- Scikit-learn
- JupyterLab (single or multiple environments)
- R
- PyTorch
- TensorFlow

Whenever required, the service can be coupled with distributed processing tools such as Hadoop/Spark.
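As an illustration of the kind of work these tools support, here is a minimal sketch of a model-training workflow using Python and scikit-learn, both listed above. The dataset and model choices are assumptions for the example, not a TeraLab-provided template.

```python
# Illustrative sketch only: a typical train/evaluate loop that a
# TeraLab workspace with Python and scikit-learn could run.
# Dataset (digits) and model (random forest) are example choices.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a small built-in dataset and split it for evaluation.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Train a classifier and measure held-out accuracy.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"test accuracy: {acc:.3f}")
```

On larger datasets, the same script benefits directly from the workspace's configurable CPU and RAM, and GPU-backed frameworks such as PyTorch or TensorFlow follow the same pattern.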
Data security is a priority for TeraLab; we therefore offer three types of workspace, depending on the level of autonomy required by the user:
- Autonomous: If the user has system administration/security skills, or the data has no confidentiality restrictions. This is the suggested choice when processing public data.
- Administered: If the user has limited administration skills, and/or the data has some confidentiality restrictions. Data transfers out of the workspace must follow a procedure.
- Highly secured: Reserved for projects where data security and confidentiality are a top priority.
SPECIAL ACCESS CONDITIONS
Acceptance of TeraLab's Terms and Conditions.
PREREQUISITES
None.
CASE EXAMPLES
An SME developing their own machine learning algorithm but lacking the powerful computing nodes needed to train it quickly.
SUCCESS STORY
Santeclair
This French SME used TeraLab's data science environment to detect fraud in the healthcare insurance domain. Together we showed that data science technologies are not restricted to large corporations. The project was built from scratch and brought up to their production environment.
SERVICE CAN BE COMBINED WITH
Big Data Prototyping Infrastructure.
ADDITIONAL INFORMATION
Users of this service are kindly asked to include in their proposal an estimate of their required infrastructure (number of VMs, and the CPU, RAM, and storage for each VM).
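For reference, such an estimate might be laid out as follows; the figures and VM roles below are purely hypothetical and should be replaced with your project's actual needs.

```
Example infrastructure estimate (hypothetical figures):
- 1 gateway/administration VM:  2 vCPU,  8 GB RAM,  50 GB storage
- 2 compute VMs:                8 vCPU, 32 GB RAM, 500 GB storage each
- Cold storage (backup):        1 TB
```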