Information about authentication can be found in section Authentication.
The scenario framework provides default templates for testing Sahara. To use them, specify a plugin and version (for the transient check and the fake plugin, the version is not necessary):
$ tox -e venv -- sahara-scenario -p vanilla -v 2.7.1
Create the YAML and/or YAML Mako template files for the scenario tests, such as etc/scenario/simple-testcase.yaml. Sample YAML files are described in the section How to write scenario files.
If you want to run scenario tests for one plugin, you should use the YAML files with a scenario for the specific plugin:
$ tox -e venv -- sahara-scenario etc/scenario/simple-testcase.yaml
or, if the file is a YAML Mako template:
$ tox -e venv -- sahara-scenario -V templatevars.ini sahara_tests/scenario/defaults/vanilla-2.7.1.yaml.mako
where templatevars.ini contains the values of the variables referenced by vanilla-2.7.1.yaml.mako.
For example, suppose you want to run tests for the Vanilla plugin with Hadoop version 2.7.1. In this case, create templatevars.ini with the appropriate values (see the section Variables and defaults templates) and use the following tox environment:
$ tox -e venv -- sahara-scenario -V templatevars.ini sahara_tests/scenario/defaults/vanilla-2.7.1.yaml.mako
If you want to run scenario tests for several plugins or versions, use several YAML and/or YAML Mako template files:
$ tox -e venv -- sahara-scenario -V templatevars.ini sahara_tests/scenario/defaults/cdh-5.4.0.yaml.mako sahara_tests/scenario/defaults/vanilla-2.7.1.yaml.mako ...
Here are a few more examples.
$ tox -e venv -- sahara-scenario -V templatevars.ini sahara_tests/scenario/defaults/credentials.yaml.mako sahara_tests/scenario/defaults/vanilla-2.7.1.yaml.mako
will run tests for the Vanilla plugin with Hadoop version 2.7.1 and the credentials located in sahara_tests/scenario/defaults/credentials.yaml.mako, replacing the variables included in vanilla-2.7.1.yaml.mako with the values defined in templatevars.ini. For more information about writing scenario YAML files, see the section How to write scenario files.
Running tox -e venv -- sahara-scenario sahara_tests/scenario/defaults will run all tests found in the given directory.
You can also validate your YAML files using the --validate flag:
$ tox -e venv -- sahara-scenario --validate -V templatevars.ini sahara_tests/scenario/defaults/credentials.yaml.mako sahara_tests/scenario/defaults/vanilla-2.7.1.yaml.mako
To generate a report, use the --report flag.
You can set the authentication variables in three ways: as environment variables, as command-line flags, or through a clouds.yaml file.

Environment variables:
- OS_USERNAME
- OS_PASSWORD
- OS_PROJECT_NAME
- OS_AUTH_URL

Command-line flags:
- --os-username
- --os-password
- --os-project-name
- --os-auth-url
Create a clouds.yaml file containing your cloud information. os-client-config looks for that file in the home directory, then in ~/.config/openstack, then in /etc/openstack. It is also possible to set the OS_CLIENT_CONFIG_FILE environment variable to that file's absolute path. After creating the file, set the OS_CLOUD variable or the --os-cloud flag to the name of the cloud you have defined and those values will be used.
Example of a clouds.yaml file:
clouds:
  scenario_cloud:
    auth:
      username: admin
      password: nova
      project_name: admin
      auth_url: http://localhost:5000/v2.0
Using this example, OS_CLOUD or --os-cloud value would be scenario_cloud. Note that more than one cloud can be defined in the same file.
More information about clouds.yaml files can be found in the os-client-config documentation.
The variables used in the Mako template files are replaced with the values from a config file, whose name is passed to the test runner through the -V parameter.
The format of the config file is an INI-style file, as accepted by the Python ConfigParser module. The key/values must be specified in the DEFAULT section.
Example of template variables file:
[DEFAULT]
network_type: neutron
network_private_name: private
...
The following variables are currently used by defaults templates:
Variable | Type | Value |
---|---|---|
network_type | string | neutron or nova-network |
network_private_name | string | private network name for OS_PROJECT_NAME |
network_public_name | string | public network name |
<plugin_name_version>_name | string | name of the image to be used for the specific plugin/version |
{ci,medium,large}_flavor_id | string | IDs of flavors of different sizes |
You can write all sections in one or several files, which can be simple YAML files or YAML-based Mako templates (.yaml.mako or .yml.mako).
This field takes an integer value and sets the concurrency for running the tests.
For parallel testing, pass the --count flag in the run command and set the concurrency value.
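For example, a top-level concurrency setting in a scenario file might look like this (the value is illustrative):

```yaml
concurrency: 4
```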
This section is dictionary-type.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
sahara_service_type | string | | data-processing | service type for sahara |
sahara_url | string | | None | url of sahara |
ssl_cert | string | | None | ssl certificate for all clients |
ssl_verify | boolean | | False | enable verify ssl for sahara |
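A minimal sketch of this section; the URL is illustrative:

```yaml
credentials:
  sahara_service_type: data-processing
  sahara_url: http://localhost:8386/v1.1
  ssl_verify: false
```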
This section is dictionary-type.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
private_network | string | True | private | name or id of private network |
public_network | string | True | public | name or id of public network |
type | string | | neutron | "neutron" or "nova-network" |
auto_assignment_floating_ip | boolean | | False | |
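A minimal sketch of this section, using the default network names:

```yaml
network:
  type: neutron
  private_network: private
  public_network: public
  auto_assignment_floating_ip: false
```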
This section is array-type.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
plugin_name | string | True | | name of plugin |
plugin_version | string | True | | version of plugin |
image | string | True | | name or id of image |
image_username | string | | | username for registering image |
existing_cluster | string | | | cluster name or id for testing |
key_name | string | | | name of registered ssh key for testing cluster |
node_group_templates | object | | | see section "node_group_templates" |
cluster_template | object | | | see section "cluster_template" |
cluster | object | | | see section "cluster" |
scaling | object | | | see section "scaling" |
timeout_check_transient | integer | | 300 | timeout for checking transient |
timeout_poll_jobs_status | integer | | 1800 | timeout for polling jobs state |
timeout_delete_resource | integer | | 300 | timeout for deleting resource |
timeout_poll_cluster_status | integer | | 3600 | timeout for polling cluster state |
scenario | array | | ['run_jobs', 'scale', 'run_jobs'] | array of checks |
edp_jobs_flow | string | | | name of edp job flow |
hdfs_username | string | | hadoop | username for hdfs |
retain_resources | boolean | | False | |
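A minimal sketch of one entry in this section; the image name and flow name are illustrative:

```yaml
clusters:
  - plugin_name: vanilla
    plugin_version: 2.7.1
    image: hadoop-2.7.1-image   # name or id of a registered image
    scenario:
      - run_jobs
      - scale
      - run_jobs
    edp_jobs_flow: test_flow
```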
This section is an array-type.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
name | string | True | | name for node group template |
flavor | string or object | True | | name or id of flavor, or see section "flavor" |
node_processes | string | True | | names of processes |
description | string | | Empty | description for node group |
volumes_per_node | integer | | 0 | minimum 0 |
volumes_size | integer | | 0 | minimum 0 |
auto_security_group | boolean | | True | |
security_group | array | | | security group |
node_configs | object | | | name_of_config_section: config: value |
availability_zone | string | | | |
volumes_availability_zone | string | | | |
volume_type | string | | | |
is_proxy_gateway | boolean | | False | use this node as proxy gateway |
edp_batching | integer | | count jobs | use for batching jobs |
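A sketch of this section; the template names, flavor, and process names are illustrative:

```yaml
node_group_templates:
  - name: master
    flavor: m1.small
    node_processes:
      - namenode
      - resourcemanager
  - name: worker
    flavor: m1.small
    node_processes:
      - datanode
      - nodemanager
    volumes_per_node: 2
    volumes_size: 2
    auto_security_group: true
```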
This section is dictionary-type.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
name | string | | auto-generated | name for flavor |
id | string | | auto-generated | id for flavor |
vcpus | integer | | 1 | number of VCPUs for the flavor |
ram | integer | | 1 | memory in MB for the flavor |
root_disk | integer | | 0 | size of local disk in GB |
ephemeral_disk | integer | | 0 | ephemeral space in MB |
swap_disk | integer | | 0 | swap space in MB |
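A sketch of a flavor defined inline instead of referenced by name or id; all values are illustrative:

```yaml
flavor:
  name: test-flavor
  vcpus: 1
  ram: 512
  root_disk: 1
```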
This section is dictionary-type.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
name | string | True | | name for cluster template |
description | string | | Empty | description |
cluster_configs | object | | | name_of_config_section: config: value |
node_group_templates | object | True | | name_of_node_group: count |
anti_affinity | array | | Empty | array of roles |
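A sketch of this section; the names and config values are illustrative:

```yaml
cluster_template:
  name: vanilla-cluster
  node_group_templates:
    master: 1
    worker: 3
  cluster_configs:
    HDFS:
      dfs.replication: 1
```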
This section is dictionary-type.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
name | string | True | Empty | name for cluster |
description | string | | Empty | description |
is_transient | boolean | | False | value |
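A minimal sketch of this section (the name is illustrative):

```yaml
cluster:
  name: test-cluster
  description: cluster for scenario tests
  is_transient: false
```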
This section is an array-type.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
operation | string | True | | "add" or "resize" |
node_group | string | True | Empty | name of node group |
size | integer | True | Empty | node count for the node group |
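A sketch of this section, assuming a node group named worker:

```yaml
scaling:
  - operation: resize
    node_group: worker
    size: 5
  - operation: add
    node_group: worker
    size: 1
```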
This section is an object whose keys are the flow names referenced in the edp_jobs_flow field of the "clusters" section. Each named flow is an array of jobs. Required: type.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
type | string | True | | "Pig", "Java", "MapReduce", "MapReduce.Streaming", "Hive", "Spark", "Shell" |
input_datasource | object | | | see section "input_datasource" |
output_datasource | object | | | see section "output_datasource" |
main_lib | object | | | see section "main_lib" |
additional_libs | object | | | see section "additional_libs" |
configs | dict | | Empty | config: value |
args | array | | Empty | array of args |
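A sketch of this section; the flow name and paths are illustrative:

```yaml
edp_jobs_flow:
  test_flow:
    - type: Pig
      input_datasource:
        type: swift
        source: edp-examples/edp-pig/input.txt
      output_datasource:
        type: hdfs
        destination: /user/hadoop/edp-output
      main_lib:
        type: swift
        source: edp-examples/edp-pig/example.pig
```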
This section is dictionary-type. Required: type, source.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
type | string | True | | "swift", "hdfs", "maprfs" |
hdfs_username | string | | | username for hdfs |
source | string | True | | uri of source |
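A minimal sketch of this section (the path is illustrative):

```yaml
input_datasource:
  type: hdfs
  hdfs_username: hadoop
  source: /user/hadoop/input
```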
This section is dictionary-type. Required: type, destination.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
type | string | True | | "swift", "hdfs", "maprfs" |
destination | string | True | | uri of destination |
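A minimal sketch of this section (the path is illustrative):

```yaml
output_datasource:
  type: hdfs
  destination: /user/hadoop/edp-output
```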
This section is dictionary-type. Required: type, source.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
type | string | True | | "swift" or "database" |
source | string | True | | uri of source |
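A minimal sketch of this section (the path is illustrative):

```yaml
main_lib:
  type: swift
  source: edp-examples/edp-java/example.jar
```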
This section is array-type. Required: type, source.
Fields | Type | Required | Default | Value |
---|---|---|---|---|
type | string | True | | "swift" or "database" |
source | string | True | | uri of source |
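A minimal sketch of this section (the path is illustrative):

```yaml
additional_libs:
  - type: database
    source: edp-examples/edp-java/lib.jar
```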