- latest TIC changes from github

parent 079a5b10
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]
**Smartphone (please complete the following information):**
- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.
inventory/hosts
\ No newline at end of file
---
# Based on ansible-lint config
extends: default
rules:
  braces:
    max-spaces-inside: 1
    level: error
  brackets:
    max-spaces-inside: 1
    level: error
  colons:
    max-spaces-after: -1
    level: error
  commas:
    max-spaces-after: -1
    level: error
  comments: disable
  comments-indentation: disable
  document-start: disable
  empty-lines:
    max: 3
    level: error
  hyphens:
    level: error
  indentation: disable
  key-duplicates: enable
  line-length: disable
  new-line-at-end-of-file: disable
  new-lines:
    type: unix
  trailing-spaces: disable
  truthy: disable
---
# ansible-playbook -v 011.initialize_hosts.yml -u root
- name: Initialize the host machines
  hosts: all
  gather_facts: yes
  roles:
    - init/common_packages
    - init/docker
    - init/post_installation
\ No newline at end of file
---
# ansible-playbook -v 012.prepare_docker_images.yml -u root
- name: Pull and Build Relevant Docker Images
  hosts: all
  gather_facts: no
  roles:
    - init/images
\ No newline at end of file
---
# ansible-playbook -v 013.mount_fs.yml -u root
- name: Mount the GlusterFS on each node of the swarm
  hosts: all
  gather_facts: yes
  roles:
    - init/mountfs
\ No newline at end of file
---
# ansible-playbook -v 014.purge_swarm.yml -u root
- name: Purge the docker swarm
  hosts: all
  gather_facts: no
  become: yes
  become_user: "root"
  tasks:
    - name: Remove docker services
      shell: 'docker service rm $(docker service ls -q)'
      ignore_errors: yes
      when: "inventory_hostname in groups.swarm_manager_prime"
    - name: Remove any orphan containers on any machine
      shell: 'docker rm $(docker ps -aq)'
      ignore_errors: yes
    - name: Leave swarm for a node
      docker_swarm:
        state: absent
      when: "inventory_hostname in groups.swarm_workers"
    - name: Remove node from swarm
      shell: "docker node rm $(docker node list -q)"
      ignore_errors: yes
      when: "inventory_hostname in groups.swarm_workers"
    - name: Remove swarm managers
      docker_swarm:
        state: absent
        force: true
      when: "inventory_hostname in groups.swarm_managers"
    - name: Remove data for all services
      shell: "rm -rf /root/hlft-store/*"
\ No newline at end of file
---
# ansible-playbook -v 014.spawn_swarm.yml -u root
- name: Create and spawn a docker swarm
  hosts: all
  gather_facts: yes
  roles:
    - swarm
---
# ansible-playbook -v 015.deploy_swarm_visualizer.yml --flush-cache -u root
- name: Deploy Docker Swarm Visualizer
  hosts: swarm_manager_prime
  gather_facts: yes
  roles:
    - viz/swarm_visualizer
\ No newline at end of file
---
# ansible-playbook -v 016.deploy_portainer.yml --flush-cache -u root
- name: Deploy Portainer
  hosts: swarm_manager_prime
  gather_facts: yes
  roles:
    - viz/portainer
\ No newline at end of file
---
# ansible-playbook -v 100.deploy_ca.yml --flush-cache -u root
- name: Spawn a Hyperledger Fabric Topology
  hosts: swarm_manager_prime
  gather_facts: yes
  roles:
    - hlf/ca
    - hlf/cli/ca
\ No newline at end of file
---
# ansible-playbook -v 101.deploy_orderer.yml --flush-cache -u root
- name: Spawn a Hyperledger Fabric Topology
  hosts: swarm_manager_prime
  gather_facts: no
  roles:
    - hlf/cli/orderer
    - hlf/orderer
\ No newline at end of file
---
# ansible-playbook -v 102.deploy_peers.yml --flush-cache -u root
- name: Spawn a Hyperledger Fabric Topology
  hosts: swarm_manager_prime
  gather_facts: no
  roles:
    - hlf/couchdb
    - hlf/peer
    - hlf/cli/peer
\ No newline at end of file
---
# ansible-playbook -v 103.deploy_cli.yml --flush-cache -u root
- name: Spawn a Hyperledger Fabric Topology
  hosts: swarm_manager_prime
  gather_facts: no
  roles:
    - hlf/cli/cli
\ No newline at end of file
---
# ansible-playbook -v 104.deploy_hlf_explorer.yml --flush-cache -u root
- name: Deploy Hyperledger Explorer service
  hosts: swarm_manager_prime
  gather_facts: yes
  roles:
    - hlf_explorer
\ No newline at end of file
---
# ansible-playbook -v 105.deploy_bank_app.yml --flush-cache -u root
- name: Deploy Sample Bank App service
  hosts: swarm_manager_prime
  gather_facts: yes
  roles:
    - bank_app
# config file for ansible -- https://ansible.com/
# ===============================================
[defaults]
inventory = ./inventory/hosts
callback_whitelist = profile_tasks
# Suppresses checking of host names against known_hosts
host_key_checking = False
# # Mitogen specific config
# strategy_plugins = ./plugins/mitogen-0.2.8/ansible_mitogen/plugins/strategy
# strategy = mitogen_linear
# Don't enable in dev mode
#[ssh_connection]
#pipelining = True
\ No newline at end of file
#!/usr/bin/python
def filter_enumerate(v):
    return list(enumerate(v))


class FilterModule(object):
    def filters(self):
        return {
            'enumerate': filter_enumerate,
        }
\ No newline at end of file
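The plugin above exposes Python's enumerate() as a Jinja2 filter. A minimal usage sketch, assuming a hypothetical task that is not part of this diff, pairing each role path with its index:

    - name: Show roles with their position in the list
      debug:
        msg: "{{ item.0 }} -> {{ item.1 }}"
      loop: "{{ ['hlf/ca', 'hlf/orderer', 'hlf/peer'] | enumerate }}"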
---
LOG_LEVEL: "INFO"
INSTALL_BANK_CHAINCODE: "y"
###########################################################################################
# #
# #
# GlusterFS Cluster conf vars #
# #
# #
###########################################################################################
# gluster volume
gluster_cluster_volume: "gfs0"
glusterd_version: '7'
###########################################################################################
# #
# #
# Hyperledger Fabric Network config vars #
# #
# #
###########################################################################################
# Organization Details
org:
  name: "hlf"
  unit: "bityoga"
# Creds of various agents
admin_user: "admin1"
admin_password: "admin1pw"
tlsca_user: "tlsca"
tlsca_password: "tlscapw"
orgca_user: "orgca"
orgca_password: "orgcapw"
orderer_user: "orderer"
orderer_password: "ordererpw"
peer1_user: "peer1"
peer1_password: "peer1pw"
peer2_user: "peer2"
peer2_password: "peer2pw"
couchdb_user: "couchdb"
couchdb_password: "couchdbpw"
hlf_explorer_db_user: "hppoc"
hlf_explorer_db_password: "password"
hlf_explorer_admin_user: "admin"
hlf_explorer_admin_password: "adminpw"
# Name of the swarm network that would host the services
swarm_network: "hlfnet"
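# A hedged illustration only (hypothetical task, not taken from the roles in
# this diff) of how an attachable overlay network with this name could be
# created with the docker_network module:
#
#   - name: Create the swarm overlay network
#     docker_network:
#       name: "{{ swarm_network }}"
#       driver: overlay
#       attachable: yes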
######################################### KEY INFO ABOUT PORT MAPPING #############################################
# All services internally run on <<target_port>> and get mapped to the <<published_port>> specified in their respective section below
# Orderer: 7050
# Peer: 7051
# CA: 7054
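# As a hedged illustration of this mapping (hypothetical task, not taken from
# the roles in this diff), the TLS CA could be published on its configured
# port while the container keeps listening on 7054:
#
#   - name: Deploy the TLS CA service
#     docker_swarm_service:
#       name: "{{ tlsca.name }}"
#       image: "{{ tlsca.image }}:{{ tlsca.tag }}"
#       networks:
#         - "{{ swarm_network }}"
#       publish:
#         - published_port: "{{ tlsca.port }}"
#           target_port: 7054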
######################################### CAs #############################################
tlsca: { switch: "on", image: "hyperledger/fabric-ca", tag: "1.4", replicas: -1, port: 8081,
         path: "/root/{{tlsca_user}}",
         db: "{{sqlite}}",
         name: "{{tlsca_user}}", password: "{{tlsca_password}}", type: "tls"
       }
orgca: { switch: "on", image: "hyperledger/fabric-ca", tag: "1.4", replicas: -1, port: 8052,
         path: "/root/{{orgca_user}}",
         db: "{{sqlite}}",
         name: "{{orgca_user}}", password: "{{orgca_password}}", type: "org"
       }
######################################### Orderer #############################################
orderer: { switch: "on", image: "hyperledger/fabric-orderer", tag: "2.2", replicas: -1, port: 8053,
           caname: "{{orgca.name}}", anchorpeer: "{{peer1.name}}", anchorport: "{{peer1.port}}",
           path: "/root/{{orderer_user}}",
           name: "{{orderer_user}}", password: "{{orderer_password}}", type: "orderer"
         }
######################################### Peers #############################################
peer1: { switch: "on", image: "hyperledger/fabric-peer", tag: "2.2", replicas: -1, port: 8054,
         caname: "{{orgca.name}}", path: "/root/{{peer1_user}}", bootstrap: "",
         dbtype: "goleveldb",
         name: "{{peer1_user}}", password: "{{peer1_password}}", type: "peer",
         leader: "{{peer1_user}}"
       }
peer2: { switch: "on", image: "hyperledger/fabric-peer", tag: "2.2", replicas: -1, port: 8055,
         caname: "{{orgca.name}}", path: "/root/{{peer2_user}}", bootstrap: "{{peer1.name}}:7051",
         dbtype: "CouchDB",
         name: "{{peer2_user}}", password: "{{peer2_password}}", type: "peer",
         leader: "{{peer1_user}}"
       }
######################################### CLI #############################################
cli: { switch: "on", image: "hyperledger/fabric-tools", tag: "2.2"}
######################################### DBs #############################################
sqlite: {type: "sqlite3", source: "fabric-ca-server.db"}
couchdb: { switch: "on", image: "couchdb", tag: "2.3", replicas: -1,
           path: "/opt/couchdb/data",
           name: "{{couchdb_user}}", password: "{{couchdb_password}}"
         }
hlf_explorer_db: {
  image: "hyperledger/explorer-db",
  tag: "1.1.2",
  name: 'hlf_explorer_db',
  replicas: -1,
  db_name: "fabricexplorer",
  db_user_name: "{{hlf_explorer_db_user}}",
  db_password: "{{hlf_explorer_db_password}}",
  port: 5432,
  switch: "on",
  volume: "pgdata"
}
hlf_explorer: {
  image: "hyperledger/explorer",
  tag: '1.1.2',
  name: 'hlf_explorer',
  admin_user: "{{hlf_explorer_admin_user}}",
  admin_password: "{{hlf_explorer_admin_password}}",
  replicas: -1,
  port: 8090,
  switch: "on",
  volume: "walletstore"
}
swarm_visualizer: {
  image: "dockersamples/visualizer",
  tag: 'latest',
  name: 'swarm_visualizer',
  replicas: -1,
  port: 9090,
  switch: "on"
}
portainer: {
  image: "portainer/portainer",
  tag: 'latest',
  name: 'portainer',
  replicas: -1,
  port: 9000,
  switch: "on"
}
portainer_agent: {
  image: "portainer/agent",
  tag: 'latest',
  name: 'portainer_agent',
  port: 9001,
  switch: "on"
}
bank_app: {
  git_repository: "https://github.com/bityoga/articonf-bank-app.git",
  image: "bank-app",
  tag: 'latest',
  name: 'bank-service',
  replicas: -1,
  port: 3000,
  switch: "on"
}
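# A hedged sketch (hypothetical tasks, not taken from this diff) of how the
# bank_app entry above could be built into a local image before it is
# deployed as a swarm service:
#
#   - name: Clone the bank app repository
#     git:
#       repo: "{{ bank_app.git_repository }}"
#       dest: "/root/{{ bank_app.image }}"
#
#   - name: Build the bank app image
#     docker_image:
#       name: "{{ bank_app.image }}"
#       tag: "{{ bank_app.tag }}"
#       source: build
#       build:
#         path: "/root/{{ bank_app.image }}"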
###########################################################################################
# #
# #
# Service Summary #
# #
# #
###########################################################################################
services:
  - "{{caservices}}"
  - "{{orderer}}"
  - "{{peerservices}}"
  - "{{explorerservices}}"
  - "{{vizservices}}"
caservices:
  - "{{tlsca}}"
  - "{{orgca}}"
peerservices:
  - "{{peer1}}"
  - "{{peer2}}"
explorerservices:
  - "{{hlf_explorer_db}}"
  - "{{hlf_explorer}}"
vizservices:
  - "{{swarm_visualizer}}"
  - "{{portainer}}"
  - "{{portainer_agent}}"
[all:children]
swarm_manager_prime
swarm_managers
swarm_workers
[swarm_manager_prime]
[swarm_managers]
[swarm_workers]
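The inventory groups ship empty; a filled-in sketch with hypothetical hosts and addresses (adjust to the target machines):

    [swarm_manager_prime]
    manager1 ansible_host=10.0.0.11 ansible_user=root

    [swarm_managers]
    manager2 ansible_host=10.0.0.12 ansible_user=root

    [swarm_workers]
    worker1 ansible_host=10.0.0.21 ansible_user=root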
# This file must be used with "source bin/activate" *from bash*
# you cannot run it directly
if [ "${BASH_SOURCE-}" = "$0" ]; then
echo "You must source this script: \$ source $0" >&2
exit 33
fi
deactivate () {
unset -f pydoc >/dev/null 2>&1
# reset old environment variables
# ! [ -z ${VAR+_} ] returns true if VAR is declared at all
if ! [ -z "${_OLD_VIRTUAL_PATH:+_}" ] ; then
PATH="$_OLD_VIRTUAL_PATH"
export PATH
unset _OLD_VIRTUAL_PATH
fi
if ! [ -z "${_OLD_VIRTUAL_PYTHONHOME+_}" ] ; then
PYTHONHOME="$_OLD_VIRTUAL_PYTHONHOME"
export PYTHONHOME
unset _OLD_VIRTUAL_PYTHONHOME
fi
# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands. Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
hash -r 2>/dev/null
fi
if ! [ -z "${_OLD_VIRTUAL_PS1+_}" ] ; then
PS1="$_OLD_VIRTUAL_PS1"
export PS1
unset _OLD_VIRTUAL_PS1
fi
unset VIRTUAL_ENV
if [ ! "${1-}" = "nondestructive" ] ; then
# Self destruct!
unset -f deactivate
fi
}
# unset irrelevant variables
deactivate nondestructive
VIRTUAL_ENV='/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible'
export VIRTUAL_ENV
_OLD_VIRTUAL_PATH="$PATH"
PATH="$VIRTUAL_ENV/bin:$PATH"
export PATH
# unset PYTHONHOME if set
if ! [ -z "${PYTHONHOME+_}" ] ; then
_OLD_VIRTUAL_PYTHONHOME="$PYTHONHOME"
unset PYTHONHOME
fi
if [ -z "${VIRTUAL_ENV_DISABLE_PROMPT-}" ] ; then
_OLD_VIRTUAL_PS1="${PS1-}"
if [ "x" != x ] ; then
PS1="${PS1-}"
else
PS1="(`basename \"$VIRTUAL_ENV\"`) ${PS1-}"
fi
export PS1
fi
# Make sure to unalias pydoc if it's already there
alias pydoc 2>/dev/null >/dev/null && unalias pydoc || true
pydoc () {
python -m pydoc "$@"
}
# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands. Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH-}" ] || [ -n "${ZSH_VERSION-}" ] ; then
hash -r 2>/dev/null
fi
# This file must be used with "source bin/activate.csh" *from csh*.
# You cannot run it directly.
# Created by Davide Di Blasi <davidedb@gmail.com>.
set newline='\
'
alias deactivate 'test $?_OLD_VIRTUAL_PATH != 0 && setenv PATH "$_OLD_VIRTUAL_PATH:q" && unset _OLD_VIRTUAL_PATH; rehash; test $?_OLD_VIRTUAL_PROMPT != 0 && set prompt="$_OLD_VIRTUAL_PROMPT:q" && unset _OLD_VIRTUAL_PROMPT; unsetenv VIRTUAL_ENV; test "\!:*" != "nondestructive" && unalias deactivate && unalias pydoc'
# Unset irrelevant variables.
deactivate nondestructive
setenv VIRTUAL_ENV '/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible'
set _OLD_VIRTUAL_PATH="$PATH:q"
setenv PATH "$VIRTUAL_ENV:q/bin:$PATH:q"
if ('' != "") then
set env_name = ''
else
set env_name = '('"$VIRTUAL_ENV:t:q"') '
endif
if ( $?VIRTUAL_ENV_DISABLE_PROMPT ) then
if ( $VIRTUAL_ENV_DISABLE_PROMPT == "" ) then
set do_prompt = "1"
else
set do_prompt = "0"
endif
else
set do_prompt = "1"
endif
if ( $do_prompt == "1" ) then
# Could be in a non-interactive environment,
# in which case, $prompt is undefined and we wouldn't
# care about the prompt anyway.
if ( $?prompt ) then
set _OLD_VIRTUAL_PROMPT="$prompt:q"
if ( "$prompt:q" =~ *"$newline:q"* ) then
:
else
set prompt = "$env_name:q$prompt:q"
endif
endif
endif
unset env_name
unset do_prompt
alias pydoc python -m pydoc
rehash
# This file must be used using `source bin/activate.fish` *within a running fish ( http://fishshell.com ) session*.
# Do not run it directly.
function _bashify_path -d "Converts a fish path to something bash can recognize"
set fishy_path $argv
set bashy_path $fishy_path[1]
for path_part in $fishy_path[2..-1]
set bashy_path "$bashy_path:$path_part"
end
echo $bashy_path
end
function _fishify_path -d "Converts a bash path to something fish can recognize"
echo $argv | tr ':' '\n'
end
function deactivate -d 'Exit virtualenv mode and return to the normal environment.'
# reset old environment variables
if test -n "$_OLD_VIRTUAL_PATH"
# https://github.com/fish-shell/fish-shell/issues/436 altered PATH handling
if test (echo $FISH_VERSION | head -c 1) -lt 3
set -gx PATH (_fishify_path "$_OLD_VIRTUAL_PATH")
else
set -gx PATH "$_OLD_VIRTUAL_PATH"
end
set -e _OLD_VIRTUAL_PATH
end
if test -n "$_OLD_VIRTUAL_PYTHONHOME"
set -gx PYTHONHOME "$_OLD_VIRTUAL_PYTHONHOME"
set -e _OLD_VIRTUAL_PYTHONHOME
end
if test -n "$_OLD_FISH_PROMPT_OVERRIDE"
and functions -q _old_fish_prompt
# Set an empty local `$fish_function_path` to allow the removal of `fish_prompt` using `functions -e`.
set -l fish_function_path
# Erase virtualenv's `fish_prompt` and restore the original.
functions -e fish_prompt
functions -c _old_fish_prompt fish_prompt
functions -e _old_fish_prompt
set -e _OLD_FISH_PROMPT_OVERRIDE
end
set -e VIRTUAL_ENV
if test "$argv[1]" != 'nondestructive'
# Self-destruct!
functions -e pydoc
functions -e deactivate
functions -e _bashify_path
functions -e _fishify_path
end
end
# Unset irrelevant variables.
deactivate nondestructive
set -gx VIRTUAL_ENV '/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible'
# https://github.com/fish-shell/fish-shell/issues/436 altered PATH handling
if test (echo $FISH_VERSION | head -c 1) -lt 3
set -gx _OLD_VIRTUAL_PATH (_bashify_path $PATH)
else
set -gx _OLD_VIRTUAL_PATH "$PATH"
end
set -gx PATH "$VIRTUAL_ENV"'/bin' $PATH
# Unset `$PYTHONHOME` if set.
if set -q PYTHONHOME
set -gx _OLD_VIRTUAL_PYTHONHOME $PYTHONHOME
set -e PYTHONHOME
end
function pydoc
python -m pydoc $argv
end
if test -z "$VIRTUAL_ENV_DISABLE_PROMPT"
# Copy the current `fish_prompt` function as `_old_fish_prompt`.
functions -c fish_prompt _old_fish_prompt
function fish_prompt
# Run the user's prompt first; it might depend on (pipe)status.
set -l prompt (_old_fish_prompt)
# Prompt override provided?
# If not, just prepend the environment name.
if test -n ''
printf '%s%s' '' (set_color normal)
else
printf '%s(%s) ' (set_color normal) (basename "$VIRTUAL_ENV")
end
string join -- \n $prompt # handle multi-line prompts
end
set -gx _OLD_FISH_PROMPT_OVERRIDE "$VIRTUAL_ENV"
end
$script:THIS_PATH = $myinvocation.mycommand.path
$script:BASE_DIR = Split-Path (Resolve-Path "$THIS_PATH/..") -Parent
function global:deactivate([switch] $NonDestructive) {
if (Test-Path variable:_OLD_VIRTUAL_PATH) {
$env:PATH = $variable:_OLD_VIRTUAL_PATH
Remove-Variable "_OLD_VIRTUAL_PATH" -Scope global
}
if (Test-Path function:_old_virtual_prompt) {
$function:prompt = $function:_old_virtual_prompt
Remove-Item function:\_old_virtual_prompt
}
if ($env:VIRTUAL_ENV) {
Remove-Item env:VIRTUAL_ENV -ErrorAction SilentlyContinue
}
if (!$NonDestructive) {
# Self destruct!
Remove-Item function:deactivate
Remove-Item function:pydoc
}
}
function global:pydoc {
python -m pydoc $args
}
# unset irrelevant variables
deactivate -nondestructive
$VIRTUAL_ENV = $BASE_DIR
$env:VIRTUAL_ENV = $VIRTUAL_ENV
New-Variable -Scope global -Name _OLD_VIRTUAL_PATH -Value $env:PATH
$env:PATH = "$env:VIRTUAL_ENV/bin:" + $env:PATH
if (!$env:VIRTUAL_ENV_DISABLE_PROMPT) {
function global:_old_virtual_prompt {
""
}
$function:_old_virtual_prompt = $function:prompt
if ("" -ne "") {
function global:prompt {
# Add the custom prefix to the existing prompt
$previous_prompt_value = & $function:_old_virtual_prompt
("" + $previous_prompt_value)
}
}
else {
function global:prompt {
# Add a prefix to the current prompt, but don't discard it.
$previous_prompt_value = & $function:_old_virtual_prompt
$new_prompt_value = "($( Split-Path $env:VIRTUAL_ENV -Leaf )) "
($new_prompt_value + $previous_prompt_value)
}
}
}
"""Xonsh activate script for virtualenv"""
from xonsh.tools import get_sep as _get_sep
def _deactivate(args):
if "pydoc" in aliases:
del aliases["pydoc"]
if ${...}.get("_OLD_VIRTUAL_PATH", ""):
$PATH = $_OLD_VIRTUAL_PATH
del $_OLD_VIRTUAL_PATH
if ${...}.get("_OLD_VIRTUAL_PYTHONHOME", ""):
$PYTHONHOME = $_OLD_VIRTUAL_PYTHONHOME
del $_OLD_VIRTUAL_PYTHONHOME
if "VIRTUAL_ENV" in ${...}:
del $VIRTUAL_ENV
if "VIRTUAL_ENV_PROMPT" in ${...}:
del $VIRTUAL_ENV_PROMPT
if "nondestructive" not in args:
# Self destruct!
del aliases["deactivate"]
# unset irrelevant variables
_deactivate(["nondestructive"])
aliases["deactivate"] = _deactivate
$VIRTUAL_ENV = r"/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible"
$_OLD_VIRTUAL_PATH = $PATH
$PATH = $PATH[:]
$PATH.add($VIRTUAL_ENV + _get_sep() + "bin", front=True, replace=True)
if ${...}.get("PYTHONHOME", ""):
# unset PYTHONHOME if set
$_OLD_VIRTUAL_PYTHONHOME = $PYTHONHOME
del $PYTHONHOME
$VIRTUAL_ENV_PROMPT = ""
if not $VIRTUAL_ENV_PROMPT:
del $VIRTUAL_ENV_PROMPT
aliases["pydoc"] = ["python", "-m", "pydoc"]
# -*- coding: utf-8 -*-
"""Activate virtualenv for current interpreter:
Use exec(open(this_file).read(), {'__file__': this_file}).
This can be used when you must use an existing Python interpreter, not the virtualenv bin/python.
"""
import os
import site
import sys
try:
abs_file = os.path.abspath(__file__)
except NameError:
raise AssertionError("You must use exec(open(this_file).read(), {'__file__': this_file}))")
bin_dir = os.path.dirname(abs_file)
base = bin_dir[: -len("bin") - 1] # strip away the bin part from the __file__, plus the path separator
# prepend bin to PATH (this file is inside the bin directory)
os.environ["PATH"] = os.pathsep.join([bin_dir] + os.environ.get("PATH", "").split(os.pathsep))
os.environ["VIRTUAL_ENV"] = base # virtual env is right above bin directory
# add the virtual environments libraries to the host python import mechanism
prev_length = len(sys.path)
for lib in "../lib/python3.7/site-packages".split(os.pathsep):
path = os.path.realpath(os.path.join(bin_dir, lib))
site.addsitedir(path.decode("utf-8") if "" else path)
sys.path[:] = sys.path[prev_length:] + sys.path[0:prev_length]
sys.real_prefix = sys.prefix
sys.prefix = base
#!/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from setuptools.command.easy_install import main
if __name__ == '__main__':
sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
sys.exit(main())
#!/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from setuptools.command.easy_install import main
if __name__ == '__main__':
sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
sys.exit(main())
#!/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from setuptools.command.easy_install import main
if __name__ == '__main__':
sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
sys.exit(main())
#!/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from pip._internal.cli.main import main
if __name__ == '__main__':
sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
sys.exit(main())
#!/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from pip._internal.cli.main import main
if __name__ == '__main__':
sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
sys.exit(main())
#!/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from pip._internal.cli.main import main
if __name__ == '__main__':
sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
sys.exit(main())
/usr/local/opt/python/bin/python3.7
\ No newline at end of file
python
\ No newline at end of file
python
\ No newline at end of file
#!/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from wheel.cli import main
if __name__ == '__main__':
sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
sys.exit(main())
#!/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from wheel.cli import main
if __name__ == '__main__':
sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
sys.exit(main())
#!/Users/antorweep/Documents/dev/fabric_as_code/py3-ansible/bin/python
# -*- coding: utf-8 -*-
import re
import sys
from wheel.cli import main
if __name__ == '__main__':
sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])
sys.exit(main())
"""Run the EasyInstall command"""
if __name__ == '__main__':
from setuptools.command.easy_install import main
main()
Copyright (c) 2008-2019 The pip developers (see AUTHORS.txt file)
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Metadata-Version: 2.1
Name: pip
Version: 20.0.2
Summary: The PyPA recommended tool for installing Python packages.
Home-page: https://pip.pypa.io/
Author: The pip developers
Author-email: pypa-dev@groups.google.com
License: MIT
Project-URL: Documentation, https://pip.pypa.io
Project-URL: Source, https://github.com/pypa/pip
Keywords: distutils easy_install egg setuptools wheel virtualenv
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Topic :: Software Development :: Build Tools
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Requires-Python: >=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*
pip - The Python Package Installer
==================================
.. image:: https://img.shields.io/pypi/v/pip.svg
:target: https://pypi.org/project/pip/
.. image:: https://readthedocs.org/projects/pip/badge/?version=latest
:target: https://pip.pypa.io/en/latest
pip is the `package installer`_ for Python. You can use pip to install packages from the `Python Package Index`_ and other indexes.
Please take a look at our documentation for how to install and use pip:
* `Installation`_
* `Usage`_
Updates are released regularly, with a new version every 3 months. More details can be found in our documentation:
* `Release notes`_
* `Release process`_
If you find bugs, need help, or want to talk to the developers please use our mailing lists or chat rooms:
* `Issue tracking`_
* `Discourse channel`_
* `User IRC`_
If you want to get involved head over to GitHub to get the source code, look at our development documentation and feel free to jump on the developer mailing lists and chat rooms:
* `GitHub page`_
* `Dev documentation`_
* `Dev mailing list`_
* `Dev IRC`_
Code of Conduct
---------------
Everyone interacting in the pip project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PyPA Code of Conduct`_.
.. _package installer: https://packaging.python.org/guides/tool-recommendations/
.. _Python Package Index: https://pypi.org
.. _Installation: https://pip.pypa.io/en/stable/installing.html
.. _Usage: https://pip.pypa.io/en/stable/
.. _Release notes: https://pip.pypa.io/en/stable/news.html
.. _Release process: https://pip.pypa.io/en/latest/development/release-process/
.. _GitHub page: https://github.com/pypa/pip
.. _Dev documentation: https://pip.pypa.io/en/latest/development
.. _Issue tracking: https://github.com/pypa/pip/issues
.. _Discourse channel: https://discuss.python.org/c/packaging
.. _Dev mailing list: https://groups.google.com/forum/#!forum/pypa-dev
.. _User IRC: https://webchat.freenode.net/?channels=%23pypa
.. _Dev IRC: https://webchat.freenode.net/?channels=%23pypa-dev
.. _PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/
Wheel-Version: 1.0
Generator: bdist_wheel (0.33.6)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any
[console_scripts]
pip = pip._internal.cli.main:main
pip3 = pip._internal.cli.main:main
pip3.8 = pip._internal.cli.main:main
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
from typing import List, Optional
__version__ = "20.0.2"
def main(args=None):
# type: (Optional[List[str]]) -> int
"""This is an internal API only meant for use by pip's own console scripts.
For additional details, see https://github.com/pypa/pip/issues/7498.
"""
from pip._internal.utils.entrypoints import _wrapper
return _wrapper(args)
from __future__ import absolute_import
import os
import sys
# If we are running from a wheel, add the wheel to sys.path
# This allows the usage python pip-*.whl/pip install pip-*.whl
if __package__ == '':
# __file__ is pip-*.whl/pip/__main__.py
# first dirname call strips of '/__main__.py', second strips off '/pip'
# Resulting path is the name of the wheel itself
# Add that to sys.path so we can import pip
path = os.path.dirname(os.path.dirname(__file__))
sys.path.insert(0, path)
from pip._internal.cli.main import main as _main # isort:skip # noqa
if __name__ == '__main__':
sys.exit(_main())
#!/usr/bin/env python
import pip._internal.utils.inject_securetransport # noqa
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
from typing import Optional, List
def main(args=None):
# type: (Optional[List[str]]) -> int
"""This is preserved for old console scripts that may still be referencing
it.
For additional details, see https://github.com/pypa/pip/issues/7498.
"""
from pip._internal.utils.entrypoints import _wrapper
return _wrapper(args)
"""Build Environment used for isolation during sdist building
"""
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
import logging
import os
import sys
import textwrap
from collections import OrderedDict
from distutils.sysconfig import get_python_lib
from sysconfig import get_paths
from pip._vendor.pkg_resources import Requirement, VersionConflict, WorkingSet
from pip import __file__ as pip_location
from pip._internal.utils.subprocess import call_subprocess
from pip._internal.utils.temp_dir import TempDirectory
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.ui import open_spinner
if MYPY_CHECK_RUNNING:
from typing import Tuple, Set, Iterable, Optional, List
from pip._internal.index.package_finder import PackageFinder
logger = logging.getLogger(__name__)
class _Prefix:
def __init__(self, path):
# type: (str) -> None
self.path = path
self.setup = False
self.bin_dir = get_paths(
'nt' if os.name == 'nt' else 'posix_prefix',
vars={'base': path, 'platbase': path}
)['scripts']
# Note: prefer distutils' sysconfig to get the
# library paths so PyPy is correctly supported.
purelib = get_python_lib(plat_specific=False, prefix=path)
platlib = get_python_lib(plat_specific=True, prefix=path)
if purelib == platlib:
self.lib_dirs = [purelib]
else:
self.lib_dirs = [purelib, platlib]
class BuildEnvironment(object):
"""Creates and manages an isolated environment to install build deps
"""
def __init__(self):
# type: () -> None
self._temp_dir = TempDirectory(kind="build-env")
self._prefixes = OrderedDict((
(name, _Prefix(os.path.join(self._temp_dir.path, name)))
for name in ('normal', 'overlay')
))
self._bin_dirs = [] # type: List[str]
self._lib_dirs = [] # type: List[str]
for prefix in reversed(list(self._prefixes.values())):
self._bin_dirs.append(prefix.bin_dir)
self._lib_dirs.extend(prefix.lib_dirs)
# Customize site to:
# - ensure .pth files are honored
# - prevent access to system site packages
system_sites = {
os.path.normcase(site) for site in (
get_python_lib(plat_specific=False),
get_python_lib(plat_specific=True),
)
}
self._site_dir = os.path.join(self._temp_dir.path, 'site')
if not os.path.exists(self._site_dir):
os.mkdir(self._site_dir)
with open(os.path.join(self._site_dir, 'sitecustomize.py'), 'w') as fp:
fp.write(textwrap.dedent(
'''
import os, site, sys
# First, drop system-sites related paths.
original_sys_path = sys.path[:]
known_paths = set()
for path in {system_sites!r}:
site.addsitedir(path, known_paths=known_paths)
system_paths = set(
os.path.normcase(path)
for path in sys.path[len(original_sys_path):]
)
original_sys_path = [
path for path in original_sys_path
if os.path.normcase(path) not in system_paths
]
sys.path = original_sys_path
# Second, add lib directories.
# ensuring .pth file are processed.
for path in {lib_dirs!r}:
assert not path in sys.path
site.addsitedir(path)
'''
).format(system_sites=system_sites, lib_dirs=self._lib_dirs))
def __enter__(self):
self._save_env = {
name: os.environ.get(name, None)
for name in ('PATH', 'PYTHONNOUSERSITE', 'PYTHONPATH')
}
path = self._bin_dirs[:]
old_path = self._save_env['PATH']
if old_path:
path.extend(old_path.split(os.pathsep))
pythonpath = [self._site_dir]
os.environ.update({
'PATH': os.pathsep.join(path),
'PYTHONNOUSERSITE': '1',
'PYTHONPATH': os.pathsep.join(pythonpath),
})
def __exit__(self, exc_type, exc_val, exc_tb):
for varname, old_value in self._save_env.items():
if old_value is None:
os.environ.pop(varname, None)
else:
os.environ[varname] = old_value
def cleanup(self):
# type: () -> None
self._temp_dir.cleanup()
def check_requirements(self, reqs):
# type: (Iterable[str]) -> Tuple[Set[Tuple[str, str]], Set[str]]
"""Return 2 sets:
- conflicting requirements: set of (installed, wanted) reqs tuples
- missing requirements: set of reqs
"""
missing = set()
conflicting = set()
if reqs:
ws = WorkingSet(self._lib_dirs)
for req in reqs:
try:
if ws.find(Requirement.parse(req)) is None:
missing.add(req)
except VersionConflict as e:
conflicting.add((str(e.args[0].as_requirement()),
str(e.args[1])))
return conflicting, missing
def install_requirements(
self,
finder, # type: PackageFinder
requirements, # type: Iterable[str]
prefix_as_string, # type: str
message # type: Optional[str]
):
# type: (...) -> None
prefix = self._prefixes[prefix_as_string]
assert not prefix.setup
prefix.setup = True
if not requirements:
return
args = [
sys.executable, os.path.dirname(pip_location), 'install',
'--ignore-installed', '--no-user', '--prefix', prefix.path,
'--no-warn-script-location',
] # type: List[str]
if logger.getEffectiveLevel() <= logging.DEBUG:
args.append('-v')
for format_control in ('no_binary', 'only_binary'):
formats = getattr(finder.format_control, format_control)
args.extend(('--' + format_control.replace('_', '-'),
','.join(sorted(formats or {':none:'}))))
index_urls = finder.index_urls
if index_urls:
args.extend(['-i', index_urls[0]])
for extra_index in index_urls[1:]:
args.extend(['--extra-index-url', extra_index])
else:
args.append('--no-index')
for link in finder.find_links:
args.extend(['--find-links', link])
for host in finder.trusted_hosts:
args.extend(['--trusted-host', host])
if finder.allow_all_prereleases:
args.append('--pre')
args.append('--')
args.extend(requirements)
with open_spinner(message) as spinner:
call_subprocess(args, spinner=spinner)
class NoOpBuildEnvironment(BuildEnvironment):
"""A no-op drop-in replacement for BuildEnvironment
"""
def __init__(self):
pass
def __enter__(self):
pass
def __exit__(self, exc_type, exc_val, exc_tb):
pass
def cleanup(self):
pass
def install_requirements(self, finder, requirements, prefix, message):
raise NotImplementedError()
"""Subpackage containing all of pip's command line interface related code
"""
# This file intentionally does not import submodules
"""Logic that powers autocompletion installed by ``pip completion``.
"""
import optparse
import os
import sys
from itertools import chain
from pip._internal.cli.main_parser import create_main_parser
from pip._internal.commands import commands_dict, create_command
from pip._internal.utils.misc import get_installed_distributions
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
from typing import Any, Iterable, List, Optional
def autocomplete():
# type: () -> None
"""Entry Point for completion of main and subcommand options.
"""
# Don't complete if user hasn't sourced bash_completion file.
if 'PIP_AUTO_COMPLETE' not in os.environ:
return
cwords = os.environ['COMP_WORDS'].split()[1:]
cword = int(os.environ['COMP_CWORD'])
try:
current = cwords[cword - 1]
except IndexError:
current = ''
parser = create_main_parser()
subcommands = list(commands_dict)
options = []
# subcommand
subcommand_name = None # type: Optional[str]
for word in cwords:
if word in subcommands:
subcommand_name = word
break
# subcommand options
if subcommand_name is not None:
# special case: 'help' subcommand has no options
if subcommand_name == 'help':
sys.exit(1)
# special case: list locally installed dists for show and uninstall
should_list_installed = (
subcommand_name in ['show', 'uninstall'] and
not current.startswith('-')
)
if should_list_installed:
installed = []
lc = current.lower()
for dist in get_installed_distributions(local_only=True):
if dist.key.startswith(lc) and dist.key not in cwords[1:]:
installed.append(dist.key)
# if there are no dists installed, fall back to option completion
if installed:
for dist in installed:
print(dist)
sys.exit(1)
subcommand = create_command(subcommand_name)
for opt in subcommand.parser.option_list_all:
if opt.help != optparse.SUPPRESS_HELP:
for opt_str in opt._long_opts + opt._short_opts:
options.append((opt_str, opt.nargs))
# filter out previously specified options from available options
prev_opts = [x.split('=')[0] for x in cwords[1:cword - 1]]
options = [(x, v) for (x, v) in options if x not in prev_opts]
# filter options by current input
options = [(k, v) for k, v in options if k.startswith(current)]
# get completion type given cwords and available subcommand options
completion_type = get_path_completion_type(
cwords, cword, subcommand.parser.option_list_all,
)
# get completion files and directories if ``completion_type`` is
# ``<file>``, ``<dir>`` or ``<path>``
if completion_type:
paths = auto_complete_paths(current, completion_type)
options = [(path, 0) for path in paths]
for option in options:
opt_label = option[0]
# append '=' to options which require args
if option[1] and option[0][:2] == "--":
opt_label += '='
print(opt_label)
else:
# show main parser options only when necessary
opts = [i.option_list for i in parser.option_groups]
opts.append(parser.option_list)
flattened_opts = chain.from_iterable(opts)
if current.startswith('-'):
for opt in flattened_opts:
if opt.help != optparse.SUPPRESS_HELP:
subcommands += opt._long_opts + opt._short_opts
else:
# get completion type given cwords and all available options
completion_type = get_path_completion_type(cwords, cword,
flattened_opts)
if completion_type:
subcommands = list(auto_complete_paths(current,
completion_type))
print(' '.join([x for x in subcommands if x.startswith(current)]))
sys.exit(1)
def get_path_completion_type(cwords, cword, opts):
# type: (List[str], int, Iterable[Any]) -> Optional[str]
"""Get the type of path completion (``file``, ``dir``, ``path`` or None)
:param cwords: same as the environmental variable ``COMP_WORDS``
:param cword: same as the environmental variable ``COMP_CWORD``
:param opts: The available options to check
:return: path completion type (``file``, ``dir``, ``path`` or None)
"""
if cword < 2 or not cwords[cword - 2].startswith('-'):
return None
for opt in opts:
if opt.help == optparse.SUPPRESS_HELP:
continue
for o in str(opt).split('/'):
if cwords[cword - 2].split('=')[0] == o:
if not opt.metavar or any(
x in ('path', 'file', 'dir')
for x in opt.metavar.split('/')):
return opt.metavar
return None
def auto_complete_paths(current, completion_type):
# type: (str, str) -> Iterable[str]
"""If ``completion_type`` is ``file`` or ``path``, list all regular files
and directories starting with ``current``; otherwise only list directories
starting with ``current``.
:param current: The word to be completed
:param completion_type: path completion type(`file`, `path` or `dir`)i
:return: A generator of regular files and/or directories
"""
directory, filename = os.path.split(current)
current_path = os.path.abspath(directory)
# Don't complete paths if they can't be accessed
if not os.access(current_path, os.R_OK):
return
filename = os.path.normcase(filename)
# list all files that start with ``filename``
file_list = (x for x in os.listdir(current_path)
if os.path.normcase(x).startswith(filename))
for f in file_list:
opt = os.path.join(current_path, f)
comp_file = os.path.normcase(os.path.join(directory, f))
# complete regular files when there is not ``<dir>`` after option
# complete directories when there is ``<file>``, ``<path>`` or
# ``<dir>``after option
if completion_type != 'dir' and os.path.isfile(opt):
yield comp_file
elif os.path.isdir(opt):
yield os.path.join(comp_file, '')
"""Base Command class, and related routines"""
from __future__ import absolute_import, print_function
import logging
import logging.config
import optparse
import os
import platform
import sys
import traceback
from pip._internal.cli import cmdoptions
from pip._internal.cli.command_context import CommandContextMixIn
from pip._internal.cli.parser import (
ConfigOptionParser,
UpdatingDefaultsHelpFormatter,
)
from pip._internal.cli.status_codes import (
ERROR,
PREVIOUS_BUILD_DIR_ERROR,
SUCCESS,
UNKNOWN_ERROR,
VIRTUALENV_NOT_FOUND,
)
from pip._internal.exceptions import (
BadCommand,
CommandError,
InstallationError,
PreviousBuildDirError,
UninstallationError,
)
from pip._internal.utils.deprecation import deprecated
from pip._internal.utils.filesystem import check_path_owner
from pip._internal.utils.logging import BrokenStdoutLoggingError, setup_logging
from pip._internal.utils.misc import get_prog, normalize_path
from pip._internal.utils.temp_dir import global_tempdir_manager
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.virtualenv import running_under_virtualenv
if MYPY_CHECK_RUNNING:
from typing import List, Tuple, Any
from optparse import Values
__all__ = ['Command']
logger = logging.getLogger(__name__)
class Command(CommandContextMixIn):
usage = None # type: str
ignore_require_venv = False # type: bool
def __init__(self, name, summary, isolated=False):
# type: (str, str, bool) -> None
super(Command, self).__init__()
parser_kw = {
'usage': self.usage,
'prog': '%s %s' % (get_prog(), name),
'formatter': UpdatingDefaultsHelpFormatter(),
'add_help_option': False,
'name': name,
'description': self.__doc__,
'isolated': isolated,
}
self.name = name
self.summary = summary
self.parser = ConfigOptionParser(**parser_kw)
# Commands should add options to this option group
optgroup_name = '%s Options' % self.name.capitalize()
self.cmd_opts = optparse.OptionGroup(self.parser, optgroup_name)
# Add the general options
gen_opts = cmdoptions.make_option_group(
cmdoptions.general_group,
self.parser,
)
self.parser.add_option_group(gen_opts)
def handle_pip_version_check(self, options):
# type: (Values) -> None
"""
This is a no-op so that commands by default do not do the pip version
check.
"""
# Make sure we do the pip version check if the index_group options
# are present.
assert not hasattr(options, 'no_index')
def run(self, options, args):
# type: (Values, List[Any]) -> Any
raise NotImplementedError
def parse_args(self, args):
# type: (List[str]) -> Tuple[Any, Any]
# factored out for testability
return self.parser.parse_args(args)
def main(self, args):
# type: (List[str]) -> int
try:
with self.main_context():
return self._main(args)
finally:
logging.shutdown()
def _main(self, args):
# type: (List[str]) -> int
# Intentionally set as early as possible so globally-managed temporary
# directories are available to the rest of the code.
self.enter_context(global_tempdir_manager())
options, args = self.parse_args(args)
# Set verbosity so that it can be used elsewhere.
self.verbosity = options.verbose - options.quiet
level_number = setup_logging(
verbosity=self.verbosity,
no_color=options.no_color,
user_log_file=options.log,
)
if (
sys.version_info[:2] == (2, 7) and
not options.no_python_version_warning
):
message = (
"A future version of pip will drop support for Python 2.7. "
"More details about Python 2 support in pip, can be found at "
"https://pip.pypa.io/en/latest/development/release-process/#python-2-support" # noqa
)
if platform.python_implementation() == "CPython":
message = (
"Python 2.7 reached the end of its life on January "
"1st, 2020. Please upgrade your Python as Python 2.7 "
"is no longer maintained. "
) + message
deprecated(message, replacement=None, gone_in=None)
if options.skip_requirements_regex:
deprecated(
"--skip-requirements-regex is unsupported and will be removed",
replacement=(
"manage requirements/constraints files explicitly, "
"possibly generating them from metadata"
),
gone_in="20.1",
issue=7297,
)
# TODO: Try to get these passing down from the command?
# without resorting to os.environ to hold these.
# This also affects isolated builds and it should.
if options.no_input:
os.environ['PIP_NO_INPUT'] = '1'
if options.exists_action:
os.environ['PIP_EXISTS_ACTION'] = ' '.join(options.exists_action)
if options.require_venv and not self.ignore_require_venv:
# If a venv is required check if it can really be found
if not running_under_virtualenv():
logger.critical(
'Could not find an activated virtualenv (required).'
)
sys.exit(VIRTUALENV_NOT_FOUND)
if options.cache_dir:
options.cache_dir = normalize_path(options.cache_dir)
if not check_path_owner(options.cache_dir):
logger.warning(
"The directory '%s' or its parent directory is not owned "
"or is not writable by the current user. The cache "
"has been disabled. Check the permissions and owner of "
"that directory. If executing pip with sudo, you may want "
"sudo's -H flag.",
options.cache_dir,
)
options.cache_dir = None
try:
status = self.run(options, args)
# FIXME: all commands should return an exit status
# and when it is done, isinstance is not needed anymore
if isinstance(status, int):
return status
except PreviousBuildDirError as exc:
logger.critical(str(exc))
logger.debug('Exception information:', exc_info=True)
return PREVIOUS_BUILD_DIR_ERROR
except (InstallationError, UninstallationError, BadCommand) as exc:
logger.critical(str(exc))
logger.debug('Exception information:', exc_info=True)
return ERROR
except CommandError as exc:
logger.critical('%s', exc)
logger.debug('Exception information:', exc_info=True)
return ERROR
except BrokenStdoutLoggingError:
# Bypass our logger and write any remaining messages to stderr
# because stdout no longer works.
print('ERROR: Pipe to stdout was broken', file=sys.stderr)
if level_number <= logging.DEBUG:
traceback.print_exc(file=sys.stderr)
return ERROR
except KeyboardInterrupt:
logger.critical('Operation cancelled by user')
logger.debug('Exception information:', exc_info=True)
return ERROR
except BaseException:
logger.critical('Exception:', exc_info=True)
return UNKNOWN_ERROR
finally:
self.handle_pip_version_check(options)
return SUCCESS
from contextlib import contextmanager
from pip._vendor.contextlib2 import ExitStack
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
from typing import Iterator, ContextManager, TypeVar
_T = TypeVar('_T', covariant=True)
class CommandContextMixIn(object):
def __init__(self):
# type: () -> None
super(CommandContextMixIn, self).__init__()
self._in_main_context = False
self._main_context = ExitStack()
@contextmanager
def main_context(self):
# type: () -> Iterator[None]
assert not self._in_main_context
self._in_main_context = True
try:
with self._main_context:
yield
finally:
self._in_main_context = False
def enter_context(self, context_provider):
# type: (ContextManager[_T]) -> _T
assert self._in_main_context
return self._main_context.enter_context(context_provider)
"""Primary application entrypoint.
"""
from __future__ import absolute_import
import locale
import logging
import os
import sys
from pip._internal.cli.autocompletion import autocomplete
from pip._internal.cli.main_parser import parse_command
from pip._internal.commands import create_command
from pip._internal.exceptions import PipError
from pip._internal.utils import deprecation
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
from typing import List, Optional
logger = logging.getLogger(__name__)
# Do not import and use main() directly! Using it directly is actively
# discouraged by pip's maintainers. The name, location and behavior of
# this function is subject to change, so calling it directly is not
# portable across different pip versions.
# In addition, running pip in-process is unsupported and unsafe. This is
# elaborated in detail at
# https://pip.pypa.io/en/stable/user_guide/#using-pip-from-your-program.
# That document also provides suggestions that should work for nearly
# all users that are considering importing and using main() directly.
# However, we know that certain users will still want to invoke pip
# in-process. If you understand and accept the implications of using pip
# in an unsupported manner, the best approach is to use runpy to avoid
# depending on the exact location of this entry point.
# The following example shows how to use runpy to invoke pip in that
# case:
#
# sys.argv = ["pip", your, args, here]
# runpy.run_module("pip", run_name="__main__")
#
# Note that this will exit the process after running, unlike a direct
# call to main. As it is not safe to do any processing after calling
# main, this should not be an issue in practice.
def main(args=None):
# type: (Optional[List[str]]) -> int
if args is None:
args = sys.argv[1:]
# Configure our deprecation warnings to be sent through loggers
deprecation.install_warning_logger()
autocomplete()
try:
cmd_name, cmd_args = parse_command(args)
except PipError as exc:
sys.stderr.write("ERROR: %s" % exc)
sys.stderr.write(os.linesep)
sys.exit(1)
# Needed for locale.getpreferredencoding(False) to work
# in pip._internal.utils.encoding.auto_decode
try:
locale.setlocale(locale.LC_ALL, '')
except locale.Error as e:
# setlocale can apparently crash if locale are uninitialized
logger.debug("Ignoring error %s when setting locale", e)
command = create_command(cmd_name, isolated=("--isolated" in cmd_args))
return command.main(cmd_args)
"""A single place for constructing and exposing the main parser
"""
import os
import sys
from pip._internal.cli import cmdoptions
from pip._internal.cli.parser import (
ConfigOptionParser,
UpdatingDefaultsHelpFormatter,
)
from pip._internal.commands import commands_dict, get_similar_commands
from pip._internal.exceptions import CommandError
from pip._internal.utils.misc import get_pip_version, get_prog
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
from typing import Tuple, List
__all__ = ["create_main_parser", "parse_command"]
def create_main_parser():
# type: () -> ConfigOptionParser
"""Creates and returns the main parser for pip's CLI
"""
parser_kw = {
'usage': '\n%prog <command> [options]',
'add_help_option': False,
'formatter': UpdatingDefaultsHelpFormatter(),
'name': 'global',
'prog': get_prog(),
}
parser = ConfigOptionParser(**parser_kw)
parser.disable_interspersed_args()
parser.version = get_pip_version()
# add the general options
gen_opts = cmdoptions.make_option_group(cmdoptions.general_group, parser)
parser.add_option_group(gen_opts)
# so the help formatter knows
parser.main = True # type: ignore
# create command listing for description
description = [''] + [
'%-27s %s' % (name, command_info.summary)
for name, command_info in commands_dict.items()
]
parser.description = '\n'.join(description)
return parser
def parse_command(args):
# type: (List[str]) -> Tuple[str, List[str]]
parser = create_main_parser()
# Note: parser calls disable_interspersed_args(), so the result of this
# call is to split the initial args into the general options before the
# subcommand and everything else.
# For example:
# args: ['--timeout=5', 'install', '--user', 'INITools']
# general_options: ['--timeout==5']
# args_else: ['install', '--user', 'INITools']
general_options, args_else = parser.parse_args(args)
# --version
if general_options.version:
sys.stdout.write(parser.version) # type: ignore
sys.stdout.write(os.linesep)
sys.exit()
# pip || pip help -> print_help()
if not args_else or (args_else[0] == 'help' and len(args_else) == 1):
parser.print_help()
sys.exit()
# the subcommand name
cmd_name = args_else[0]
if cmd_name not in commands_dict:
guess = get_similar_commands(cmd_name)
msg = ['unknown command "%s"' % cmd_name]
if guess:
msg.append('maybe you meant "%s"' % guess)
raise CommandError(' - '.join(msg))
# all the args without the subcommand
cmd_args = args[:]
cmd_args.remove(cmd_name)
return cmd_name, cmd_args
"""Base option parser setup"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import logging
import optparse
import sys
import textwrap
from distutils.util import strtobool
from pip._vendor.six import string_types
from pip._internal.cli.status_codes import UNKNOWN_ERROR
from pip._internal.configuration import Configuration, ConfigurationError
from pip._internal.utils.compat import get_terminal_size
logger = logging.getLogger(__name__)
class PrettyHelpFormatter(optparse.IndentedHelpFormatter):
"""A prettier/less verbose help formatter for optparse."""
def __init__(self, *args, **kwargs):
# help position must be aligned with __init__.parseopts.description
kwargs['max_help_position'] = 30
kwargs['indent_increment'] = 1
kwargs['width'] = get_terminal_size()[0] - 2
optparse.IndentedHelpFormatter.__init__(self, *args, **kwargs)
def format_option_strings(self, option):
return self._format_option_strings(option, ' <%s>', ', ')
def _format_option_strings(self, option, mvarfmt=' <%s>', optsep=', '):
"""
Return a comma-separated list of option strings and metavars.
:param option: tuple of (short opt, long opt), e.g: ('-f', '--format')
:param mvarfmt: metavar format string - evaluated as mvarfmt % metavar
:param optsep: separator
"""
opts = []
if option._short_opts:
opts.append(option._short_opts[0])
if option._long_opts:
opts.append(option._long_opts[0])
if len(opts) > 1:
opts.insert(1, optsep)
if option.takes_value():
metavar = option.metavar or option.dest.lower()
opts.append(mvarfmt % metavar.lower())
return ''.join(opts)
def format_heading(self, heading):
if heading == 'Options':
return ''
return heading + ':\n'
def format_usage(self, usage):
"""
Ensure there is only one newline between usage and the first heading
if there is no description.
"""
msg = '\nUsage: %s\n' % self.indent_lines(textwrap.dedent(usage), " ")
return msg
def format_description(self, description):
# leave full control over description to us
if description:
if hasattr(self.parser, 'main'):
label = 'Commands'
else:
label = 'Description'
# some doc strings have initial newlines, some don't
description = description.lstrip('\n')
# some doc strings have final newlines and spaces, some don't
description = description.rstrip()
# dedent, then reindent
description = self.indent_lines(textwrap.dedent(description), " ")
description = '%s:\n%s\n' % (label, description)
return description
else:
return ''
def format_epilog(self, epilog):
# leave full control over epilog to us
if epilog:
return epilog
else:
return ''
def indent_lines(self, text, indent):
new_lines = [indent + line for line in text.split('\n')]
return "\n".join(new_lines)
class UpdatingDefaultsHelpFormatter(PrettyHelpFormatter):
"""Custom help formatter for use in ConfigOptionParser.
This is updates the defaults before expanding them, allowing
them to show up correctly in the help listing.
"""
def expand_default(self, option):
if self.parser is not None:
self.parser._update_defaults(self.parser.defaults)
return optparse.IndentedHelpFormatter.expand_default(self, option)
class CustomOptionParser(optparse.OptionParser):
def insert_option_group(self, idx, *args, **kwargs):
"""Insert an OptionGroup at a given position."""
group = self.add_option_group(*args, **kwargs)
self.option_groups.pop()
self.option_groups.insert(idx, group)
return group
@property
def option_list_all(self):
"""Get a list of all options, including those in option groups."""
res = self.option_list[:]
for i in self.option_groups:
res.extend(i.option_list)
return res
class ConfigOptionParser(CustomOptionParser):
"""Custom option parser which updates its defaults by checking the
configuration files and environmental variables"""
def __init__(self, *args, **kwargs):
self.name = kwargs.pop('name')
isolated = kwargs.pop("isolated", False)
self.config = Configuration(isolated)
assert self.name
optparse.OptionParser.__init__(self, *args, **kwargs)
def check_default(self, option, key, val):
try:
return option.check_value(key, val)
except optparse.OptionValueError as exc:
print("An error occurred during configuration: %s" % exc)
sys.exit(3)
def _get_ordered_configuration_items(self):
# Configuration gives keys in an unordered manner. Order them.
override_order = ["global", self.name, ":env:"]
# Pool the options into different groups
section_items = {name: [] for name in override_order}
for section_key, val in self.config.items():
# ignore empty values
if not val:
logger.debug(
"Ignoring configuration key '%s' as it's value is empty.",
section_key
)
continue
section, key = section_key.split(".", 1)
if section in override_order:
section_items[section].append((key, val))
# Yield each group in their override order
for section in override_order:
for key, val in section_items[section]:
yield key, val
def _update_defaults(self, defaults):
"""Updates the given defaults with values from the config files and
the environ. Does a little special handling for certain types of
options (lists)."""
# Accumulate complex default state.
self.values = optparse.Values(self.defaults)
late_eval = set()
# Then set the options with those values
for key, val in self._get_ordered_configuration_items():
# '--' because configuration supports only long names
option = self.get_option('--' + key)
# Ignore options not present in this parser. E.g. non-globals put
# in [global] by users that want them to apply to all applicable
# commands.
if option is None:
continue
if option.action in ('store_true', 'store_false', 'count'):
try:
val = strtobool(val)
except ValueError:
error_msg = invalid_config_error_message(
option.action, key, val
)
self.error(error_msg)
elif option.action == 'append':
val = val.split()
val = [self.check_default(option, key, v) for v in val]
elif option.action == 'callback':
late_eval.add(option.dest)
opt_str = option.get_opt_string()
val = option.convert_value(opt_str, val)
# From take_action
args = option.callback_args or ()
kwargs = option.callback_kwargs or {}
option.callback(option, opt_str, val, self, *args, **kwargs)
else:
val = self.check_default(option, key, val)
defaults[option.dest] = val
for key in late_eval:
defaults[key] = getattr(self.values, key)
self.values = None
return defaults
def get_default_values(self):
"""Overriding to make updating the defaults after instantiation of
the option parser possible, _update_defaults() does the dirty work."""
if not self.process_default_values:
# Old, pre-Optik 1.5 behaviour.
return optparse.Values(self.defaults)
# Load the configuration, or error out in case of an error
try:
self.config.load()
except ConfigurationError as err:
self.exit(UNKNOWN_ERROR, str(err))
defaults = self._update_defaults(self.defaults.copy()) # ours
for option in self._get_all_options():
default = defaults.get(option.dest)
if isinstance(default, string_types):
opt_str = option.get_opt_string()
defaults[option.dest] = option.check_value(opt_str, default)
return optparse.Values(defaults)
def error(self, msg):
self.print_usage(sys.stderr)
self.exit(UNKNOWN_ERROR, "%s\n" % msg)
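# Illustrative sketch (not part of the upstream file): how a command parser is
# typically wired so that get_default_values() folds configuration-file and
# environment values into the optparse defaults.  The command name 'fake' and
# the --timeout option are hypothetical.
def _demo_config_option_parser():
    parser = ConfigOptionParser(
        name='fake',             # also the config section consulted for overrides
        usage='%prog [options]',
        isolated=False,          # passed straight through to Configuration
    )
    parser.add_option('--timeout', dest='timeout', type='float', default=15)
    # Overrides are applied in the order [global], [fake], then environment
    # values (e.g. PIP_TIMEOUT), so later sources win over earlier ones.
    return parser.get_default_values().timeout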
def invalid_config_error_message(action, key, val):
"""Returns a better error message when invalid configuration option
is provided."""
if action in ('store_true', 'store_false'):
return ("{0} is not a valid value for {1} option, "
"please specify a boolean value like yes/no, "
"true/false or 1/0 instead.").format(val, key)
return ("{0} is not a valid value for {1} option, "
"please specify a numerical value like 1/0 "
"instead.").format(val, key)
from __future__ import absolute_import
SUCCESS = 0
ERROR = 1
UNKNOWN_ERROR = 2
VIRTUALENV_NOT_FOUND = 3
PREVIOUS_BUILD_DIR_ERROR = 4
NO_MATCHES_FOUND = 23
"""
Package containing all pip commands
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import importlib
from collections import OrderedDict, namedtuple
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
from typing import Any
from pip._internal.cli.base_command import Command
CommandInfo = namedtuple('CommandInfo', 'module_path, class_name, summary')
# The ordering matters for help display.
# Also, even though the module path starts with the same
# "pip._internal.commands" prefix in each case, we include the full path
# because it makes testing easier (specifically when modifying commands_dict
# in test setup / teardown by adding info for a FakeCommand class defined
# in a test-related module).
# Finally, we need to pass an iterable of pairs here rather than a dict
# so that the ordering won't be lost when using Python 2.7.
commands_dict = OrderedDict([
('install', CommandInfo(
'pip._internal.commands.install', 'InstallCommand',
'Install packages.',
)),
('download', CommandInfo(
'pip._internal.commands.download', 'DownloadCommand',
'Download packages.',
)),
('uninstall', CommandInfo(
'pip._internal.commands.uninstall', 'UninstallCommand',
'Uninstall packages.',
)),
('freeze', CommandInfo(
'pip._internal.commands.freeze', 'FreezeCommand',
'Output installed packages in requirements format.',
)),
('list', CommandInfo(
'pip._internal.commands.list', 'ListCommand',
'List installed packages.',
)),
('show', CommandInfo(
'pip._internal.commands.show', 'ShowCommand',
'Show information about installed packages.',
)),
('check', CommandInfo(
'pip._internal.commands.check', 'CheckCommand',
'Verify installed packages have compatible dependencies.',
)),
('config', CommandInfo(
'pip._internal.commands.configuration', 'ConfigurationCommand',
'Manage local and global configuration.',
)),
('search', CommandInfo(
'pip._internal.commands.search', 'SearchCommand',
'Search PyPI for packages.',
)),
('wheel', CommandInfo(
'pip._internal.commands.wheel', 'WheelCommand',
'Build wheels from your requirements.',
)),
('hash', CommandInfo(
'pip._internal.commands.hash', 'HashCommand',
'Compute hashes of package archives.',
)),
('completion', CommandInfo(
'pip._internal.commands.completion', 'CompletionCommand',
'A helper command used for command completion.',
)),
('debug', CommandInfo(
'pip._internal.commands.debug', 'DebugCommand',
'Show information useful for debugging.',
)),
('help', CommandInfo(
'pip._internal.commands.help', 'HelpCommand',
'Show help for commands.',
)),
]) # type: OrderedDict[str, CommandInfo]
def create_command(name, **kwargs):
# type: (str, **Any) -> Command
"""
Create an instance of the Command class with the given name.
"""
module_path, class_name, summary = commands_dict[name]
module = importlib.import_module(module_path)
command_class = getattr(module, class_name)
command = command_class(name=name, summary=summary, **kwargs)
return command
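# Illustrative sketch (not part of the upstream file): how the registry above
# is typically consumed.  'freeze' is just one entry from commands_dict; extra
# keyword arguments (such as isolated) are forwarded to the command class.
def _demo_create_command():
    command = create_command('freeze', isolated=False)
    return command.name, command.summary
    # ('freeze', 'Output installed packages in requirements format.')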
def get_similar_commands(name):
"""Command name auto-correct."""
from difflib import get_close_matches
name = name.lower()
close_commands = get_close_matches(name, commands_dict.keys())
if close_commands:
return close_commands[0]
else:
return False
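# Illustrative sketch (not part of the upstream file): the auto-correct helper
# above with a hypothetical typo.  A string with no close match returns False.
def _demo_get_similar_commands():
    return get_similar_commands('instal')  # -> 'install'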
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import logging
from pip._internal.cli.base_command import Command
from pip._internal.operations.check import (
check_package_set,
create_package_set_from_installed,
)
from pip._internal.utils.misc import write_output
logger = logging.getLogger(__name__)
class CheckCommand(Command):
"""Verify installed packages have compatible dependencies."""
usage = """
%prog [options]"""
def run(self, options, args):
package_set, parsing_probs = create_package_set_from_installed()
missing, conflicting = check_package_set(package_set)
for project_name in missing:
version = package_set[project_name].version
for dependency in missing[project_name]:
write_output(
"%s %s requires %s, which is not installed.",
project_name, version, dependency[0],
)
for project_name in conflicting:
version = package_set[project_name].version
for dep_name, dep_version, req in conflicting[project_name]:
write_output(
"%s %s has requirement %s, but you have %s %s.",
project_name, version, req, dep_name, dep_version,
)
if missing or conflicting or parsing_probs:
return 1
else:
write_output("No broken requirements found.")
from pip._internal.distributions.sdist import SourceDistribution
from pip._internal.distributions.wheel import WheelDistribution
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
from pip._internal.distributions.base import AbstractDistribution
from pip._internal.req.req_install import InstallRequirement
def make_distribution_for_install_requirement(install_req):
# type: (InstallRequirement) -> AbstractDistribution
"""Returns a Distribution for the given InstallRequirement
"""
# Editable requirements will always be source distributions. They use the
# legacy logic until we create a modern standard for them.
if install_req.editable:
return SourceDistribution(install_req)
# If it's a wheel, it's a WheelDistribution
if install_req.is_wheel:
return WheelDistribution(install_req)
# Otherwise, a SourceDistribution
return SourceDistribution(install_req)
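# Illustrative sketch (not part of the upstream file): exercising the dispatch
# above with a plain requirement string.  install_req_from_line is assumed to
# live in pip._internal.req.constructors in this code base, and 'requests' is
# only an example name; a non-editable, non-wheel requirement comes back as a
# SourceDistribution.
def _demo_make_distribution():
    from pip._internal.req.constructors import install_req_from_line
    req = install_req_from_line('requests')
    return make_distribution_for_install_requirement(req)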