Ansible Configurations

This repo is an example of using Ansible to manage multiple data center domains.

There are examples for ACI, NXOS, and UCS. Each example provides a check/monitor component and a create/update/remove component for the aspect of the domain being managed.

  • Technology stack: The primary technology is Ansible, using modules that ship with Ansible core as well as open-source modules.

Additionally, there is Python code for a simple Flask-based web service that listens for the payload GitHub sends on a push event.

  • Status: This is Beta code that will change as additional configurations and checks are added to the repository.

Walk, Run, Fly

Walk

Use the check.yml files in each domain to check the status of a specific configuration, for example VLANs configured on UCS and NXOS, or a tenant configured on ACI.
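
For example, a check run against the NXOS domain might look like the following; the domain directory name and vault password file path are illustrative assumptions based on the Usage section below.

cd nxos    # assumed domain directory name; the ACI and UCS domains would be analogous
ansible-playbook -i inventory check.yml --vault-password-file=~/.vault_pass.txt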

Run

Use the site.yml files to add, update, or remove the specific configurations in each domain, for example adding, updating, or deleting VLANs on UCS and NXOS, or a tenant on ACI.
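
A sketch of applying changes with site.yml; the --check pass is Ansible's standard dry-run flag and only previews changes where the underlying modules support check mode.

# Preview the changes first (where the modules implement check mode)
ansible-playbook -i inventory site.yml --vault-password-file=~/.vault_pass.txt --check
# Apply the add/update/delete operations
ansible-playbook -i inventory site.yml --vault-password-file=~/.vault_pass.txt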

Fly

Enable the Flask-based webhook listener to process the payload from a GitHub push-event webhook. Additionally, the listener can send the results of the push-event processing to a Webex Teams room.
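
One way to start the listener is with the Flask CLI; the module filename and port below are illustrative assumptions, and the GitHub webhook is then pointed at the listener's URL.

# The listener filename is an assumption; substitute the actual file in this repository
export FLASK_APP=webhook_listener.py
flask run --host=0.0.0.0 --port=5000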

Installation

  • Ansible is required; version 2.7.10 was used when developing this repository
  • Create a Python3 virtual environment
  • Activate the virtual environment
  • Pip install the following into the virtual environment (example commands follow this list):
    • ucsmsdk to enable Ansible to interact with UCS Manager
    • webexteamssdk to send results to a Webex Teams room
    • Flask for the webhook listener
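
Taken together, the installation steps might look like this in a Linux or macOS shell (the virtual environment name is arbitrary):

python3 -m venv venv                      # create a Python3 virtual environment
source venv/bin/activate                  # activate the virtual environment
pip install ansible==2.7.10               # if Ansible is not already installed
pip install ucsmsdk webexteamssdk flask   # UCS Manager SDK, Webex Teams SDK, and the web listener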

Configuration

  • Ansible Vault is used to maintain secrets (see the example commands after this list)
  • Knowledge of the Webex Teams API is required for the webhook listener
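
A minimal sketch of the vault workflow; the vaulted file path is an illustrative assumption, while ~/.vault_pass.txt matches the Usage command below.

# Create or edit a vaulted variables file (the path shown is an assumption)
ansible-vault create group_vars/all/vault.yml
ansible-vault edit group_vars/all/vault.yml
# Store the vault password for automated runs and restrict its permissions
echo 'my-vault-password' > ~/.vault_pass.txt   # placeholder password
chmod 600 ~/.vault_pass.txt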

Usage

To run an Ansible playbook, use the following command line in the appropriate domain directory.

ansible-playbook -i inventory site.yml --vault-password-file=~/.vault_pass.txt

Because Ansible Vault is used, the vault password is required; supplying it with --vault-password-file allows the playbook to run in an automated fashion.
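
For an interactive run, Ansible's standard --ask-vault-pass flag prompts for the password instead; the password file shown above is what enables unattended runs such as webhook-triggered execution.

# Interactive alternative: prompt for the vault password
ansible-playbook -i inventory site.yml --ask-vault-pass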

Use Case

In this scenario, network management is automated, but the networking for the Cisco Virtual Interface Cards (VICs) in the UCS servers is not, and it requires a manual step in the application provisioning process. VLAN management requires a VLAN to be added to the global UCS fabric, and the VLAN must also be added to the vNIC templates of the vNIC interfaces for each server in the ESX cluster where application virtual machines might reside.

The existing network automation uses Ansible, and administrators want to use the same methodology throughout the data center. Any new Ansible modules, playbooks, and supporting software must be compatible with the existing Ansible implementation. All of the Ansible playbooks are maintained in a GitHub repository and are automatically run upon the completion of a git push. In addition, automated deployment of an Application Profile in ACI, along with the UCS and NX-OS VLANs, streamlines the entire application deployment process.

Objectives

  • In this use case, Ansible serves as a single configuration toolset for all the platforms and manages multi-domain networking across UCS, Nexus, and ACI.
  • Automated deployment of an Application Profile in ACI, along with the UCS and NX-OS VLANs, streamlines the entire application deployment process.

Requirements

  • Ansible 2.6+
  • Virtual Machine or Hyper-V (if you are a Windows user).

Topology

(Topology diagram)

Business Summary

This use case demonstrates data center multi-domain automation with a common toolset, in this case Ansible. A common toolset and methodology enables a quicker time to results.

Learning Labs

Introduction to ACI and Ansible
