Ardy Deploy package

class ardy.core.deploy.deploy.Deploy(*args, **kwargs)

Bases: ardy.config.ConfigMixin

build = None
build_artefact(src_project=None)

Build the artefact for the lambdas defined in our project. Steps:

  • Build the artefact
  • Read the file locally or upload it to S3, as defined in config["deploy"]["deploy_method"]

Parameters: src_project – str. Name of the folder or path of the project where our code lives
Returns: bool
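
For illustration, a minimal sketch of calling build_artefact on its own; the constructor arguments and project name are assumptions for this sketch, not part of the documented signature:

    import os
    from ardy.core.deploy.deploy import Deploy

    # Assumes an ardy configuration file is discoverable from this directory (assumption).
    deployer = Deploy(path=os.path.dirname(os.path.abspath(__file__)))

    # Zip the project code and stage it locally or in S3,
    # depending on config["deploy"]["deploy_method"].
    ok = deployer.build_artefact(src_project="my_lambda_project")  # placeholder project name
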
deploy()
Upload code to AWS Lambda. Before calling this method, set the zip file with the code via self.set_artefact(code=code). Checks all the lambdas in our config file, or the functions passed on the command line that exist in our config file. If a function uploads correctly, its versions, aliases and triggers are updated or created.
Returns: True
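
A hedged sketch of the set_artefact / deploy flow described above; the zip path is a placeholder and the constructor arguments are an assumption about how the project is configured:

    from ardy.core.deploy.deploy import Deploy

    deployer = Deploy()  # constructor arguments are project-specific (assumption)

    # Point deploy() at the zipped code before uploading (placeholder path).
    with open("dist/my_lambda_project.zip", "rb") as f:
        deployer.set_artefact(code={"ZipFile": f.read()})

    # Create/update functions, versions, aliases and triggers for the configured lambdas.
    deployer.deploy()
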
static is_client_result_ok(result)
lambdas_to_deploy = []
remote_create_lambada(**kwargs)
remote_get_lambda(**kwargs)
remote_list_lambdas()
remote_publish_version(**kwargs)
remote_update_alias(**kwargs)
remote_update_code_lambada(**kwargs)
remote_update_conf_lambada(**kwargs)
run(src_project=None, path_to_zip_file=None)

Deploy the lambdas defined in our project (see the sketch after the parameter list below). Steps:

  • Build the artefact
  • Read the file locally or upload it to S3, as defined in config["deploy"]["deploy_method"]
  • Reload the configuration with the deploy changes
  • Check whether each lambda already exists:
      • Create the Lambda if it does not
      • Update the Lambda if it does
Parameters:
  • src_project – str. Name of the folder or path of the project where our code lives
  • path_to_zip_file – str. Path to an already built local zip file to deploy
Returns: bool
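
A minimal end-to-end sketch of run(); the constructor keyword arguments, project name and zip path below are assumptions used for illustration:

    import os
    from ardy.core.deploy.deploy import Deploy

    deployer = Deploy(path=os.path.dirname(os.path.abspath(__file__)))  # kwargs are an assumption

    # Build the artefact, stage it per config["deploy"]["deploy_method"],
    # then create or update each lambda defined in the configuration.
    ok = deployer.run(src_project="my_lambda_project")  # placeholder project name

    # Alternatively, deploy a pre-built zip instead of building a new artefact (placeholder path).
    # ok = deployer.run(path_to_zip_file="dist/my_lambda_project.zip")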

set_artefact(code)
Parameters: code – dict. It must have one of these shapes:

{'ZipFile': } or {'S3Bucket': deploy_bucket, 'S3Key': s3_keyfile}
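
Both accepted shapes written out as a sketch; the file path, bucket and key are placeholders:

    # Shape 1: inline zipped code bytes (placeholder path).
    with open("dist/my_lambda_project.zip", "rb") as f:
        code = {"ZipFile": f.read()}

    # Shape 2: artefact already uploaded to S3 (placeholder bucket and key).
    code = {"S3Bucket": "my-deploy-bucket", "S3Key": "my_lambda_project.zip"}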

set_artefact_path(path_to_zip_file)

Set the path to the local zip file to deploy.

Parameters: path_to_zip_file – str. Path to the local zip file
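
A short sketch; the path is a placeholder and the constructor arguments are an assumption:

    from ardy.core.deploy.deploy import Deploy

    deployer = Deploy()  # constructor arguments are project-specific (assumption)
    deployer.set_artefact_path("dist/my_lambda_project.zip")  # pre-built zip to deploy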