a guest
Aug 23rd, 2019
# Imports assumed by this snippet; `session` (a sagemaker.Session) and
# `prefix` (an S3 key prefix) are defined in earlier notebook cells.
import sagemaker
from sagemaker import get_execution_role
from sagemaker.amazon.amazon_estimator import get_image_uri

# Our current execution role is required when creating the model, as the training
# and inference code will need to access the model artifacts.
role = get_execution_role()

# Retrieve the location of the XGBoost container provided by Amazon.
# For convenience, the training and inference code both use the same container.
container = get_image_uri(session.boto_region_name, 'xgboost')

# Create a SageMaker estimator using the container location determined above.
# A single training instance of type ml.m4.xlarge is recommended, with
# 's3://{}/{}/output'.format(session.default_bucket(), prefix) as the output path.
xgb = sagemaker.estimator.Estimator(container,  # The location of the container we wish to use
                                    role,       # Our current IAM role
                                    train_instance_count=1,              # How many compute instances
                                    train_instance_type='ml.m4.xlarge',  # What kind of compute instances
                                    output_path='s3://{}/{}/output'.format(session.default_bucket(), prefix),
                                    sagemaker_session=session)

# Set the XGBoost hyperparameters on the xgb object.
# The label is binary, so we use the 'binary:logistic' objective.
xgb.set_hyperparameters(max_depth=5,
                        eta=0.2,
                        gamma=4,
                        min_child_weight=6,
                        subsample=0.8,
                        silent=0,
                        objective='binary:logistic',
                        early_stopping_rounds=10,
                        num_round=500)
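As a quick local check of the output-path pattern passed to the estimator above, the format string expands as shown below. The bucket and prefix names here are hypothetical stand-ins for `session.default_bucket()` and the notebook's `prefix` variable, so this runs without any AWS credentials.

```python
# Hypothetical values standing in for session.default_bucket() and prefix.
bucket = "sagemaker-us-east-1-123456789012"  # assumed default bucket name
prefix = "xgboost-example"                   # assumed S3 key prefix

# Same pattern used for output_path in the Estimator call above.
output_path = "s3://{}/{}/output".format(bucket, prefix)
print(output_path)  # s3://sagemaker-us-east-1-123456789012/xgboost-example/output
```

SageMaker writes the trained model artifact (model.tar.gz) under this prefix once the training job completes.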