
dbenesj/whoville
Hortonworks HDP 2.6.3+ / HDF 3.0+ Simple Autodeployment

A set of quick-deployment scripts and supporting artifacts to deploy Hortonworks HDP/HDF demo sandboxes

To deploy: a fresh HDF 3 install with the HDF 3 example application, the Streaming Trucking Demo

Status: Tested with build HDP 2.6.3 / Ambari 2.5.1 / HDF mpack 3.0.1.1-5

  • Pre-reqs:

    • Launch a single vanilla CentOS/RHEL 7.x VM (e.g. a local VM, OpenStack, or a cloud provider of your choice)
    • The VM must not already have any Ambari or HDP components installed (i.e. do NOT run the script on an HDP sandbox)
    • The VM needs 4 vCPUs and ~17-18 GB RAM once all services are running and a query is executed, so an m3.2xlarge-sized instance is recommended
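
Before running the installer, a quick pre-flight check along these lines can confirm the VM matches the pre-reqs above. This is an illustrative sketch, not part of the deploy script; the thresholds are taken from the sizing notes, and it assumes a Linux host:

```shell
#!/usr/bin/env bash
# Illustrative pre-flight check; thresholds come from the sizing notes above.
MIN_CPUS=4
MIN_MEM_GB=16

cpus=$(nproc)
mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
mem_gb=$((mem_kb / 1024 / 1024))

[ "$cpus" -lt "$MIN_CPUS" ] && echo "WARN: only ${cpus} vCPUs; ${MIN_CPUS}+ recommended"
[ "$mem_gb" -lt "$MIN_MEM_GB" ] && echo "WARN: only ${mem_gb} GB RAM; demo needs ~17-18 GB"

# The deploy script expects a clean host, so bail out if Ambari is already present
if command -v ambari-server >/dev/null 2>&1; then
  echo "ERROR: ambari-server already installed; use a vanilla VM" >&2
else
  echo "pre-flight: host looks clean"
fi
```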
  • Login to the instance and run:

curl -sSL https://raw.githubusercontent.com/harshn08/whoville/master/deploy_generic_SAMTruckingDemo_fromscratch.sh | sudo -E bash

Once the script completes (about 30 minutes), you can start reviewing the Schema Registry, NiFi, SAM, and Storm UIs. However, you will need to wait an additional 20-30 minutes for Druid to index the data before you can start building Superset dashboards against the Druid cubes.

Login details
  • Ambari port: 8080 login: admin/StrongPassword
  • Superset port: 9089 login: admin/StrongPassword
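
The same Ambari credentials also work against Ambari's REST API, which is handy for checking service states from the shell. A minimal sketch, assuming the cluster name is `HDP` (the actual name on your install may differ):

```shell
AMBARI_URL="http://localhost:8080"   # Ambari port from the login details above
CLUSTER="HDP"                        # assumption: check your actual cluster name

# Small wrapper around Ambari's REST API using the admin credentials above
ambari_get() {
  curl -s -u admin:StrongPassword "${AMBARI_URL}/api/v1/$1"
}

# Example call (only works once Ambari is up):
#   ambari_get "clusters/${CLUSTER}/services?fields=ServiceInfo/state"
echo "Ambari API base: ${AMBARI_URL}/api/v1/clusters/${CLUSTER}"
```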
Demo walkthrough

Detailed walkthrough available here

What is automated
  • Ambari install
  • HDF Mpack install
  • HDP+HDF install
  • Create demo artifacts:
    • Kafka topics for trucking
    • Schema Registry schemas for trucking
    • NiFi flow for trucking
    • SAM artifacts
    • SAM flow
    • HBase/Phoenix tables
    • Trucking simulator
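
As an illustration of the first item, creating topics on an HDP-era Kafka looks roughly like this. The topic names and install path here are assumptions for illustration, not values taken from the deploy script (the script defines the real ones), so this sketch only prints the commands rather than running them:

```shell
KAFKA_HOME=/usr/hdp/current/kafka-broker   # typical HDP install path (assumption)
ZK=localhost:2181

# Illustrative topic names; see the deploy script for the actual ones.
# Printed as a dry run so they can be reviewed before execution.
for topic in truck_events_avro truck_speed_events_avro; do
  echo "${KAFKA_HOME}/bin/kafka-topics.sh --create --zookeeper ${ZK}" \
       "--replication-factor 1 --partitions 1 --topic ${topic}"
done
```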
What is not automated
  • Import of the Superset dashboard
    • This can be created manually by following the steps in the docs or tutorial
Older versions

Previous README available here
