Merge branch 'lbryum-refactor'

This commit is contained in:
Jack Robison 2018-08-24 15:04:35 -04:00
commit b101fafd39
No known key found for this signature in database
GPG key ID: DF25C68FE0239BB2
218 changed files with 5423 additions and 4822 deletions

.gitignore (vendored): 30 changed lines
View file

@ -1,25 +1,9 @@
*.pyc
*.egg
*.so
*.xml
*.iml
*.log
*.pem
*.decTest
*.prof
.#*
/build
/dist
/.tox
/.idea
/.coverage
/build/build
/build/dist
/bulid/requirements_base.txt
/lbrynet.egg-info
/docs_build
/lbry-venv
.idea/
.coverage
.DS_Store
# temporary files from the twisted.trial test runner
lbrynet.egg-info
__pycache__
_trial_temp/

View file

@ -121,7 +121,11 @@ disable=
unidiomatic-typecheck,
global-at-module-level,
inconsistent-return-statements,
keyword-arg-before-vararg
keyword-arg-before-vararg,
assignment-from-no-return,
useless-return,
assignment-from-none,
stop-iteration-return
[REPORTS]
@ -386,7 +390,7 @@ int-import-graph=
[DESIGN]
# Maximum number of arguments for function / method
max-args=5
max-args=10
# Argument names that match this expression will be ignored. Default to name
# with leading underscore
@ -405,7 +409,7 @@ max-branches=12
max-statements=50
# Maximum number of parents for a class (see R0901).
max-parents=7
max-parents=8
# Maximum number of attributes for a class (see R0902).
max-attributes=7

View file

@ -1,42 +1,105 @@
os: linux
dist: trusty
sudo: required
dist: xenial
language: python
python: 2.7
python: "3.7"
branches:
except:
- gh-pages
jobs:
include:
- stage: code quality
name: "pylint lbrynet"
install:
- pip install pylint
- pip install git+https://github.com/lbryio/torba.git
- pip install git+https://github.com/lbryio/lbryschema.git
- pip install -e .
script: pylint lbrynet
- &tests
stage: test
name: "Unit Tests w/ Python 3.7"
install:
- pip install coverage
- pip install git+https://github.com/lbryio/torba.git
- pip install git+https://github.com/lbryio/lbryschema.git
- pip install -e .[test]
script: HOME=/tmp coverage run --source=lbrynet -m twisted.trial --reactor=asyncio tests.unit
after_success:
- bash <(curl -s https://codecov.io/bash)
- <<: *tests
name: "Unit Tests w/ Python 3.6"
python: "3.6"
- <<: *tests
name: "DHT Tests w/ Python 3.7"
script: HOME=/tmp coverage run --source=lbrynet -m twisted.trial --reactor=asyncio tests.functional
- <<: *tests
name: "DHT Tests w/ Python 3.6"
python: "3.6"
script: HOME=/tmp coverage run --source=lbrynet -m twisted.trial --reactor=asyncio tests.functional
- name: "Integration Tests"
install:
- pip install tox-travis coverage
- pushd .. && git clone https://github.com/lbryio/electrumx.git --branch lbryumx && popd
- pushd .. && git clone https://github.com/lbryio/orchstr8.git && popd
- pushd .. && git clone https://github.com/lbryio/lbryschema.git && popd
- pushd .. && git clone https://github.com/lbryio/lbryumx.git && cd lbryumx && git checkout afd34f323dd94c516108a65240f7d17aea8efe85 && cd .. && popd
- pushd .. && git clone https://github.com/lbryio/torba.git && popd
script: tox
after_success:
- coverage combine tests/
- bash <(curl -s https://codecov.io/bash)
- stage: build
name: "Windows"
language: generic
services:
- docker
install:
- docker pull cdrx/pyinstaller-windows:python3-32bit
script:
- docker run -v "$(pwd):/src/lbry" cdrx/pyinstaller-windows:python3-32bit lbry/scripts/wine_build.sh
addons:
artifacts:
working_dir: dist
paths:
- lbrynet.exe
target_paths:
- /daemon/build-${TRAVIS_BUILD_NUMBER}_commit-${TRAVIS_COMMIT:0:7}_branch-${TRAVIS_BRANCH}$([ ! -z ${TRAVIS_TAG} ] && echo _tag-${TRAVIS_TAG})/win/
- &build
name: "Linux"
python: "3.6"
install:
- pip3 install pyinstaller
- pip3 install git+https://github.com/lbryio/torba.git
- pip3 install git+https://github.com/lbryio/lbryschema.git
- pip3 install -e .
script:
- pyinstaller -F -n lbrynet lbrynet/cli.py
- ./dist/lbrynet --version
env: OS=linux
addons:
artifacts:
working_dir: dist
paths:
- lbrynet
# artifact uploader thinks lbrynet is a directory, https://github.com/travis-ci/artifacts/issues/78
target_paths:
- /daemon/build-${TRAVIS_BUILD_NUMBER}_commit-${TRAVIS_COMMIT:0:7}_branch-${TRAVIS_BRANCH}$([ ! -z ${TRAVIS_TAG} ] && echo _tag-${TRAVIS_TAG})/${OS}/lbrynet
- <<: *build
name: "Mac"
os: osx
osx_image: xcode9.4
language: generic
env: OS=mac
cache:
directories:
- $HOME/.cache/pip
- $HOME/Library/Caches/pip
- $TRAVIS_BUILD_DIR/cache/wheel
addons:
#srcclr:
# debug: true
apt:
packages:
- libgmp3-dev
- build-essential
- git
- libssl-dev
- libffi-dev
before_install:
- virtualenv venv
- source venv/bin/activate
install:
- pip install -U pip==9.0.3
- pip install -r requirements.txt
- pip install -r requirements_testing.txt
- pip install .
script:
- pip install mock pylint
- pylint lbrynet
- PYTHONPATH=. trial lbrynet.tests
- rvm install ruby-2.3.1
- rvm use 2.3.1 && gem install danger --version '~> 4.0' && danger
- $TRAVIS_BUILD_DIR/.tox

View file

@ -9,7 +9,7 @@ at anytime.
## [Unreleased]
### Security
*
* Upgraded `cryptography` package.
*
### Fixed
@ -21,15 +21,18 @@ at anytime.
*
### Changed
*
*
* Ported to Python 3 without backwards compatibility with Python 2.
* Switched to a brand new wallet implementation: torba.
* Format of wallet has changed to support multiple accounts in one wallet.
### Added
*
*
* `fund` command, used to move funds between or within an account in various ways.
* `max_address_gap` command, for finding large gaps of unused addresses.
* `balance` command, a more detailed version of `wallet_balance` which includes all accounts.
* `account` command, adding/deleting/modifying accounts including setting the default account.
### Removed
*
* `send_amount_to_address` command, which was previously marked as deprecated
*
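
The new commands above are reachable like any other daemon method over its JSON-RPC API. A minimal sketch of invoking `balance`, assuming the daemon's customary localhost:5279 API endpoint (both the port and the empty-params payload are assumptions, not part of this diff):

import json
import urllib.request

request = urllib.request.Request(
    "http://localhost:5279",                       # assumed default API port
    data=json.dumps({"method": "balance", "params": {}}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    print(json.load(response)["result"])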

View file

@ -1,33 +0,0 @@
$env:Path += ";C:\MinGW\bin\"
$env:Path += ";C:\Program Files (x86)\Windows Kits\10\bin\x86\"
gcc --version
mingw32-make --version
# build/install miniupnpc manually
tar zxf miniupnpc-1.9.tar.gz
cd miniupnpc-1.9
mingw32-make -f Makefile.mingw
python setupmingw32.py build --compiler=mingw32
python setupmingw32.py install
cd ..\
Remove-Item -Recurse -Force miniupnpc-1.9
# copy requirements from lbry, but remove miniupnpc (installed manually)
Get-Content ..\requirements.txt | Select-String -Pattern 'miniupnpc' -NotMatch | Out-File requirements_base.txt
python set_build.py
pip install -r requirements.txt
pip install ..\.
pyinstaller -y daemon.onefile.spec
pyinstaller -y cli.onefile.spec
pyinstaller -y console.onefile.spec
nuget install secure-file -ExcludeVersion
secure-file\tools\secure-file -decrypt .\lbry2.pfx.enc -secret "$env:pfx_key"
signtool.exe sign /f .\lbry2.pfx /p "$env:key_pass" /tr http://tsa.starfieldtech.com /td SHA256 /fd SHA256 dist\*.exe
python zip_daemon.py
python upload_assets.py

View file

@ -1,59 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
ROOT="$( cd "$( dirname "${BASH_SOURCE[0]}" )/.." && pwd )"
cd "$ROOT"
BUILD_DIR="$ROOT/build"
FULL_BUILD="${FULL_BUILD:-false}"
if [ -n "${TEAMCITY_VERSION:-}" -o -n "${APPVEYOR:-}" ]; then
FULL_BUILD="true"
fi
[ -d "$BUILD_DIR/bulid" ] && rm -rf "$BUILD_DIR/build"
[ -d "$BUILD_DIR/dist" ] && rm -rf "$BUILD_DIR/dist"
if [ "$FULL_BUILD" == "true" ]; then
# install dependencies
$BUILD_DIR/prebuild.sh
VENV="$BUILD_DIR/venv"
if [ -d "$VENV" ]; then
rm -rf "$VENV"
fi
virtualenv "$VENV"
set +u
source "$VENV/bin/activate"
set -u
# must set build before installing lbrynet. otherwise it has no effect
python "$BUILD_DIR/set_build.py"
fi
cp "$ROOT/requirements.txt" "$BUILD_DIR/requirements_base.txt"
(
cd "$BUILD_DIR"
pip install -r requirements.txt
)
(
cd "$BUILD_DIR"
pyinstaller -y daemon.onefile.spec
pyinstaller -y cli.onefile.spec
pyinstaller -y console.onefile.spec
)
python "$BUILD_DIR/zip_daemon.py"
if [ "$FULL_BUILD" == "true" ]; then
# electron-build has a publish feature, but I had a hard time getting
# it to reliably work and it also seemed difficult to configure. Not proud of
# this, but it seemed better to write my own.
python "$BUILD_DIR/upload_assets.py"
deactivate
fi
echo 'Build complete.'

View file

@ -1,42 +0,0 @@
# -*- mode: python -*-
import platform
import os
dir = 'build';
cwd = os.getcwd()
if os.path.basename(cwd) != dir:
raise Exception('pyinstaller build needs to be run from the ' + dir + ' directory')
repo_base = os.path.abspath(os.path.join(cwd, '..'))
execfile(os.path.join(cwd, "entrypoint.py")) # ghetto import
system = platform.system()
if system == 'Darwin':
icns = os.path.join(repo_base, 'build', 'icon.icns')
elif system == 'Linux':
icns = os.path.join(repo_base, 'build', 'icons', '256x256.png')
elif system == 'Windows':
icns = os.path.join(repo_base, 'build', 'icons', 'lbry256.ico')
else:
print 'Warning: System {} has no icons'.format(system)
icns = None
a = Entrypoint('lbrynet', 'console_scripts', 'lbrynet-cli', pathex=[cwd])
pyz = PYZ(a.pure, a.zipped_data)
exe = EXE(
pyz,
a.scripts,
a.binaries,
a.zipfiles,
a.datas,
name='lbrynet-cli',
debug=False,
strip=False,
upx=True,
console=True,
icon=icns
)

View file

@ -1,50 +0,0 @@
# -*- mode: python -*-
import platform
import os
import lbryum
dir = 'build';
cwd = os.getcwd()
if os.path.basename(cwd) != dir:
raise Exception('pyinstaller build needs to be run from the ' + dir + ' directory')
repo_base = os.path.abspath(os.path.join(cwd, '..'))
execfile(os.path.join(cwd, "entrypoint.py")) # ghetto import
system = platform.system()
if system == 'Darwin':
icns = os.path.join(repo_base, 'build', 'icon.icns')
elif system == 'Linux':
icns = os.path.join(repo_base, 'build', 'icons', '256x256.png')
elif system == 'Windows':
icns = os.path.join(repo_base, 'build', 'icons', 'lbry256.ico')
else:
print 'Warning: System {} has no icons'.format(system)
icns = None
datas = [
(os.path.join(os.path.dirname(lbryum.__file__), 'wordlist', language + '.txt'), 'lbryum/wordlist')
for language in ('chinese_simplified', 'japanese', 'spanish','english', 'portuguese')
]
a = Entrypoint('lbrynet', 'console_scripts', 'lbrynet-console', pathex=[cwd], datas=datas)
pyz = PYZ(a.pure, a.zipped_data)
exe = EXE(
pyz,
a.scripts,
a.binaries,
a.zipfiles,
a.datas,
name='lbrynet-console',
debug=False,
strip=False,
upx=True,
console=True,
icon=icns
)

View file

@ -1,50 +0,0 @@
# -*- mode: python -*-
import platform
import os
import lbryum
dir = 'build';
cwd = os.getcwd()
if os.path.basename(cwd) != dir:
raise Exception('pyinstaller build needs to be run from the ' + dir + ' directory')
repo_base = os.path.abspath(os.path.join(cwd, '..'))
execfile(os.path.join(cwd, "entrypoint.py")) # ghetto import
system = platform.system()
if system == 'Darwin':
icns = os.path.join(repo_base, 'build', 'icon.icns')
elif system == 'Linux':
icns = os.path.join(repo_base, 'build', 'icons', '256x256.png')
elif system == 'Windows':
icns = os.path.join(repo_base, 'build', 'icons', 'lbry256.ico')
else:
print 'Warning: System {} has no icons'.format(system)
icns = None
datas = [
(os.path.join(os.path.dirname(lbryum.__file__), 'wordlist', language + '.txt'), 'lbryum/wordlist')
for language in ('chinese_simplified', 'japanese', 'spanish','english', 'portuguese')
]
a = Entrypoint('lbrynet', 'console_scripts', 'lbrynet-daemon', pathex=[cwd], datas=datas)
pyz = PYZ(a.pure, a.zipped_data)
exe = EXE(
pyz,
a.scripts,
a.binaries,
a.zipfiles,
a.datas,
name='lbrynet-daemon',
debug=False,
strip=False,
upx=True,
console=True,
icon=icns
)

View file

@ -1,47 +0,0 @@
# https://github.com/pyinstaller/pyinstaller/wiki/Recipe-Setuptools-Entry-Point
def Entrypoint(dist, group, name,
scripts=None, pathex=None, binaries=None, datas=None,
hiddenimports=None, hookspath=None, excludes=None, runtime_hooks=None,
cipher=None, win_no_prefer_redirects=False, win_private_assemblies=False):
import pkg_resources
# get toplevel packages of distribution from metadata
def get_toplevel(dist):
distribution = pkg_resources.get_distribution(dist)
if distribution.has_metadata('top_level.txt'):
return list(distribution.get_metadata('top_level.txt').split())
else:
return []
hiddenimports = hiddenimports or []
packages = []
for distribution in hiddenimports:
packages += get_toplevel(distribution)
scripts = scripts or []
pathex = pathex or []
# get the entry point
ep = pkg_resources.get_entry_info(dist, group, name)
# insert path of the egg at the very front of the search path
pathex = [ep.dist.location] + pathex
# script name must not be a valid module name to avoid name clashes on import
script_path = os.path.join(workpath, name + '-script.py')
print "creating script for entry point", dist, group, name
with open(script_path, 'w') as fh:
fh.write("import {0}\n".format(ep.module_name))
fh.write("{0}.{1}()\n".format(ep.module_name, '.'.join(ep.attrs)))
for package in packages:
fh.write("import {0}\n".format(package))
return Analysis([script_path] + scripts,
pathex=pathex,
binaries=binaries,
datas=datas,
hiddenimports=hiddenimports,
hookspath=hookspath,
excludes=excludes,
runtime_hooks=runtime_hooks,
cipher=cipher,
win_no_prefer_redirects=win_no_prefer_redirects,
win_private_assemblies=win_private_assemblies
)
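
For illustration, the wrapper script this recipe writes into workpath is tiny. Given a hypothetical entry point registered as lbrynet-cli = pkg.cli:main, the fh.write calls above would produce just:

# <workpath>/lbrynet-cli-script.py (generated; module path is hypothetical)
import pkg.cli
pkg.cli.main()
# ...followed by one bare "import <package>" line per hidden import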

Binary file not shown.

Binary file not shown.

View file

@ -1,82 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
LINUX=false
OSX=false
if [ "$(uname)" == "Darwin" ]; then
OSX=true
elif [ "$(expr substr $(uname -s) 1 5)" == "Linux" ]; then
LINUX=true
else
echo "Platform detection failed"
exit 1
fi
SUDO=''
if $LINUX && (( $EUID != 0 )); then
SUDO='sudo'
fi
cmd_exists() {
command -v "$1" >/dev/null 2>&1
return $?
}
set +eu
GITUSERNAME=$(git config --global --get user.name)
if [ -z "$GITUSERNAME" ]; then
git config --global user.name "$(whoami)"
fi
GITEMAIL=$(git config --global --get user.email)
if [ -z "$GITEMAIL" ]; then
git config --global user.email "$(whoami)@lbry.io"
fi
set -eu
if $LINUX; then
INSTALL="$SUDO apt-get install --no-install-recommends -y"
$INSTALL build-essential libssl-dev libffi-dev python2.7-dev wget
elif $OSX && ! cmd_exists brew ; then
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
fi
if ! cmd_exists python; then
if $LINUX; then
$INSTALL python2.7
elif $OSX; then
brew install python
curl https://bootstrap.pypa.io/get-pip.py | python
fi
fi
PYTHON_VERSION=$(python -c 'import sys; print(".".join(map(str, sys.version_info[:2])))')
if [ "$PYTHON_VERSION" != "2.7" ]; then
echo "Python 2.7 required"
exit 1
fi
if ! cmd_exists pip; then
if $LINUX; then
$INSTALL python-pip
$SUDO pip install --upgrade pip
else
echo "Pip required"
exit 1
fi
fi
if $LINUX && [ "$(pip list --format=columns | grep setuptools | wc -l)" -ge 1 ]; then
#$INSTALL python-setuptools
$SUDO pip install setuptools
fi
if ! cmd_exists virtualenv; then
$SUDO pip install virtualenv
fi

View file

@ -1,11 +0,0 @@
# install daemon requirements (created by build script. see build.sh, build.ps1)
-r requirements_base.txt
# install daemon itself. make sure you run `pip install` from this dir. this is how you do relative file paths with pip
file:../.
# install other build requirements
PyInstaller==3.2.1
requests[security]==2.13.0
uritemplate==3.0.0
boto3==1.4.4

View file

@ -1,29 +0,0 @@
"""Set the build version to be 'dev', 'qa', 'rc', 'release'"""
import os.path
import re
import subprocess
import sys
def main():
build = get_build()
root_dir = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
with open(os.path.join(root_dir, 'lbrynet', 'build_type.py'), 'w') as f:
f.write("BUILD = '{}'\n".format(build))
def get_build():
try:
tag = subprocess.check_output(['git', 'describe', '--exact-match']).strip()
if re.match('v\d+\.\d+\.\d+rc\d+', tag):
return 'rc'
else:
return 'release'
except subprocess.CalledProcessError:
# if the build doesn't have a tag
return 'qa'
if __name__ == '__main__':
sys.exit(main())
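
A quick sketch of how get_build() classifies a checkout, using hypothetical tags:

import re

for tag in ("v0.30.0rc1", "v0.30.0"):              # hypothetical tag names
    kind = "rc" if re.match(r"v\d+\.\d+\.\d+rc\d+", tag) else "release"
    print(tag, "->", kind)
# an untagged commit makes `git describe --exact-match` fail,
# raising CalledProcessError, and is classified as "qa"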

View file

@ -1,143 +0,0 @@
import glob
import json
import os
import subprocess
import sys
import github
import uritemplate
import boto3
def main():
upload_to_github_if_tagged('lbryio/lbry')
upload_to_s3('daemon')
def get_asset_filename():
this_dir = os.path.dirname(os.path.realpath(__file__))
return glob.glob(this_dir + '/dist/*.zip')[0]
def upload_to_s3(folder):
tag = subprocess.check_output(['git', 'describe', '--always', '--abbrev=8', 'HEAD']).strip()
commit_date = subprocess.check_output([
'git', 'show', '-s', '--format=%cd', '--date=format:%Y%m%d-%H%I%S', 'HEAD']).strip()
asset_path = get_asset_filename()
bucket = 'releases.lbry.io'
key = folder + '/' + commit_date + '-' + tag + '/' + os.path.basename(asset_path)
print "Uploading " + asset_path + " to s3://" + bucket + '/' + key + ''
if 'AWS_ACCESS_KEY_ID' not in os.environ or 'AWS_SECRET_ACCESS_KEY' not in os.environ:
print 'Must set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY to publish assets to s3'
return 1
s3 = boto3.resource(
's3',
aws_access_key_id=os.environ['AWS_ACCESS_KEY_ID'],
aws_secret_access_key=os.environ['AWS_SECRET_ACCESS_KEY'],
config=boto3.session.Config(signature_version='s3v4')
)
s3.meta.client.upload_file(asset_path, bucket, key)
def upload_to_github_if_tagged(repo_name):
try:
current_tag = subprocess.check_output(
['git', 'describe', '--exact-match', 'HEAD']).strip()
except subprocess.CalledProcessError:
print 'Not uploading to GitHub as we are not currently on a tag'
return 1
print "Current tag: " + current_tag
if 'GH_TOKEN' not in os.environ:
print 'Must set GH_TOKEN in order to publish assets to a release'
return 1
gh_token = os.environ['GH_TOKEN']
auth = github.Github(gh_token)
repo = auth.get_repo(repo_name)
if not check_repo_has_tag(repo, current_tag):
print 'Tag {} is not in repo {}'.format(current_tag, repo)
# TODO: maybe this should be an error
return 1
asset_path = get_asset_filename()
print "Uploading " + asset_path + " to Github tag " + current_tag
release = get_github_release(repo, current_tag)
upload_asset_to_github(release, asset_path, gh_token)
def check_repo_has_tag(repo, target_tag):
tags = repo.get_tags().get_page(0)
for tag in tags:
if tag.name == target_tag:
return True
return False
def get_github_release(repo, current_tag):
for release in repo.get_releases():
if release.tag_name == current_tag:
return release
raise Exception('No release for {} was found'.format(current_tag))
def upload_asset_to_github(release, asset_to_upload, token):
basename = os.path.basename(asset_to_upload)
for asset in release.raw_data['assets']:
if asset['name'] == basename:
print 'File {} has already been uploaded to {}'.format(basename, release.tag_name)
return
upload_uri = uritemplate.expand(release.upload_url, {'name': basename})
count = 0
while count < 10:
try:
output = _curl_uploader(upload_uri, asset_to_upload, token)
if 'errors' in output:
raise Exception(output)
else:
print 'Successfully uploaded to {}'.format(output['browser_download_url'])
except Exception:
print 'Failed uploading on attempt {}'.format(count + 1)
count += 1
def _curl_uploader(upload_uri, asset_to_upload, token):
# using requests.post fails miserably with SSL EPIPE errors. I spent
# half a day trying to debug before deciding to switch to curl.
#
# TODO: actually set the content type
print 'Using curl to upload {} to {}'.format(asset_to_upload, upload_uri)
cmd = [
'curl',
'-sS',
'-X', 'POST',
'-u', ':{}'.format(os.environ['GH_TOKEN']),
'--header', 'Content-Type: application/octet-stream',
'--data-binary', '@-',
upload_uri
]
# '-d', '{"some_key": "some_value"}',
print 'Calling curl:'
print cmd
print
with open(asset_to_upload, 'rb') as fp:
p = subprocess.Popen(cmd, stdin=fp, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
stdout, stderr = p.communicate()
print 'curl return code:', p.returncode
if stderr:
print 'stderr output from curl:'
print stderr
print 'stdout from curl:'
print stdout
return json.loads(stdout)
if __name__ == '__main__':
sys.exit(main())

View file

@ -1,29 +0,0 @@
import os
import platform
import subprocess
import sys
import zipfile
def main():
this_dir = os.path.dirname(os.path.realpath(__file__))
tag = subprocess.check_output(['git', 'describe']).strip()
zipfilename = 'lbrynet-daemon-{}-{}.zip'.format(tag, get_system_label())
full_filename = os.path.join(this_dir, 'dist', zipfilename)
executables = ['lbrynet-daemon', 'lbrynet-cli', 'lbrynet-console']
ext = '.exe' if platform.system() == 'Windows' else ''
with zipfile.ZipFile(full_filename, 'w') as myzip:
for executable in executables:
myzip.write(os.path.join(this_dir, 'dist', executable + ext), executable + ext)
def get_system_label():
system = platform.system()
if system == 'Darwin':
return 'macos'
else:
return system.lower()
if __name__ == '__main__':
sys.exit(main())

11 binary image files changed, each shown Before/After at identical size: 7.4 KiB, 18 KiB, 1.2 KiB, 2.6 KiB, 6.1 KiB, 97 KiB, 1.1 KiB, 361 KiB, 5.3 KiB, 15 KiB, 31 KiB

View file

@ -1,6 +1,6 @@
import logging
__version__ = "0.21.2"
__version__ = "0.30.0a"
version = tuple(__version__.split('.'))
logging.getLogger(__name__).addHandler(logging.NullHandler())

View file

@ -24,7 +24,7 @@ BLOB_BYTES_UPLOADED = 'Blob Bytes Uploaded'
log = logging.getLogger(__name__)
class Manager(object):
class Manager:
def __init__(self, analytics_api, context=None, installation_id=None, session_id=None):
self.analytics_api = analytics_api
self._tracked_data = collections.defaultdict(list)
@ -158,7 +158,7 @@ class Manager(object):
@staticmethod
def _download_properties(id_, name, claim_dict=None, report=None):
sd_hash = None if not claim_dict else claim_dict.source_hash
sd_hash = None if not claim_dict else claim_dict.source_hash.decode()
p = {
'download_id': id_,
'name': name,
@ -177,9 +177,9 @@ class Manager(object):
return {
'download_id': id_,
'name': name,
'stream_info': claim_dict.source_hash,
'stream_info': claim_dict.source_hash.decode(),
'error': error_name(error),
'reason': error.message,
'reason': str(error),
'report': report
}
@ -193,7 +193,7 @@ class Manager(object):
'build': platform['build'],
'wallet': {
'name': wallet,
'version': platform['lbryum_version'] if wallet == conf.LBRYUM_WALLET else None
'version': platform['lbrynet_version']
},
},
# TODO: expand os info to give linux/osx specific info
@ -219,7 +219,7 @@ class Manager(object):
callback(maybe_deferred, *args, **kwargs)
class Api(object):
class Api:
def __init__(self, cookies, url, write_key, enabled):
self.cookies = cookies
self.url = url
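
The added .decode() calls are the Python 3 bytes/str split at work: the hash arrives as bytes, and the analytics payload must survive JSON encoding. A minimal sketch with a hypothetical value:

import json

source_hash = b"6e1d2a"                            # hypothetical; bytes on Python 3
payload = {"stream_info": source_hash.decode()}    # bytes -> str before serializing
print(json.dumps(payload))                         # raw bytes here would raise TypeError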

View file

@ -1 +1 @@
import paths
from . import paths
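
Python 3 dropped implicit relative imports, which is all this one-line change (and the similar ones below) is about. A sketch, assuming a package pkg containing paths.py:

# pkg/__init__.py
# import paths          # Python 2 only: implicit relative import;
#                       # ModuleNotFoundError on Python 3
from . import paths     # explicit relative import, valid on both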

View file

@ -1,4 +1,4 @@
from blob_file import BlobFile
from creator import BlobFileCreator
from writer import HashBlobWriter
from reader import HashBlobReader
from .blob_file import BlobFile
from .creator import BlobFileCreator
from .writer import HashBlobWriter
from .reader import HashBlobReader

View file

@ -13,7 +13,7 @@ log = logging.getLogger(__name__)
MAX_BLOB_SIZE = 2 * 2 ** 20
class BlobFile(object):
class BlobFile:
"""
A chunk of data available on the network which is specified by a hashsum
@ -60,12 +60,12 @@ class BlobFile(object):
finished_deferred - deferred that is fired when write is finished and returns
an instance of itself as HashBlob
"""
if not peer in self.writers:
if peer not in self.writers:
log.debug("Opening %s to be written by %s", str(self), str(peer))
finished_deferred = defer.Deferred()
writer = HashBlobWriter(self.get_length, self.writer_finished)
self.writers[peer] = (writer, finished_deferred)
return (writer, finished_deferred)
return writer, finished_deferred
log.warning("Tried to download the same file twice simultaneously from the same peer")
return None, None
@ -149,7 +149,7 @@ class BlobFile(object):
def writer_finished(self, writer, err=None):
def fire_finished_deferred():
self._verified = True
for p, (w, finished_deferred) in self.writers.items():
for p, (w, finished_deferred) in list(self.writers.items()):
if w == writer:
del self.writers[p]
finished_deferred.callback(self)
@ -160,7 +160,7 @@ class BlobFile(object):
return False
def errback_finished_deferred(err):
for p, (w, finished_deferred) in self.writers.items():
for p, (w, finished_deferred) in list(self.writers.items()):
if w == writer:
del self.writers[p]
finished_deferred.errback(err)

View file

@ -8,7 +8,7 @@ from lbrynet.core.cryptoutils import get_lbry_hash_obj
log = logging.getLogger(__name__)
class BlobFileCreator(object):
class BlobFileCreator:
"""
This class is used to create blobs on the local filesystem
when we do not know the blob hash beforehand (i.e, when creating

View file

@ -3,7 +3,7 @@ import logging
log = logging.getLogger(__name__)
class HashBlobReader(object):
class HashBlobReader:
"""
This is a file like reader class that supports
read(size) and close()
@ -15,7 +15,7 @@ class HashBlobReader(object):
def __del__(self):
if self.finished_cb_d is None:
log.warn("Garbage collection was called, but reader for %s was not closed yet",
log.warning("Garbage collection was called, but reader for %s was not closed yet",
self.read_handle.name)
self.close()
@ -28,5 +28,3 @@ class HashBlobReader(object):
return
self.read_handle.close()
self.finished_cb_d = self.finished_cb(self)

View file

@ -7,7 +7,7 @@ from lbrynet.core.cryptoutils import get_lbry_hash_obj
log = logging.getLogger(__name__)
class HashBlobWriter(object):
class HashBlobWriter:
def __init__(self, length_getter, finished_cb):
self.write_handle = BytesIO()
self.length_getter = length_getter
@ -18,7 +18,7 @@ class HashBlobWriter(object):
def __del__(self):
if self.finished_cb_d is None:
log.warn("Garbage collection was called, but writer was not closed yet")
log.warning("Garbage collection was called, but writer was not closed yet")
self.close()
@property

lbrynet/cli.py (new file): 162 lines
View file

@ -0,0 +1,162 @@
import sys
from twisted.internet import asyncioreactor
if 'twisted.internet.reactor' not in sys.modules:
asyncioreactor.install()
else:
from twisted.internet import reactor
if not isinstance(reactor, asyncioreactor.AsyncioSelectorReactor):
# pyinstaller hooks install the default reactor before
# any of our code runs, see kivy for similar problem:
# https://github.com/kivy/kivy/issues/4182
del sys.modules['twisted.internet.reactor']
asyncioreactor.install()
import json
import asyncio
from aiohttp.client_exceptions import ClientConnectorError
from requests.exceptions import ConnectionError
from docopt import docopt
from textwrap import dedent
from lbrynet.daemon.Daemon import Daemon
from lbrynet.daemon.DaemonControl import start as daemon_main
from lbrynet.daemon.DaemonConsole import main as daemon_console
from lbrynet.daemon.auth.client import LBRYAPIClient
from lbrynet.core.system_info import get_platform
async def execute_command(method, params, conf_path=None):
# this checks whether the daemon is running
try:
api = await LBRYAPIClient.get_client(conf_path)
await api.status()
except (ClientConnectorError, ConnectionError):
await api.session.close()
print("Could not connect to daemon. Are you sure it's running?")
return 1
# this actually executes the method
try:
resp = await api.call(method, params)
await api.session.close()
print(json.dumps(resp["result"], indent=2))
except KeyError:
if resp["error"]["code"] == -32500:
print(json.dumps(resp["error"], indent=2))
else:
print(json.dumps(resp["error"]["message"], indent=2))
def print_help():
print(dedent("""
NAME
lbrynet - LBRY command line client.
USAGE
lbrynet [--conf <config file>] <command> [<args>]
EXAMPLES
lbrynet commands # list available commands
lbrynet status # get daemon status
lbrynet --conf ~/l1.conf status # like above but using ~/l1.conf as config file
lbrynet resolve_name what # resolve a name
lbrynet help resolve_name # get help for a command
"""))
def print_help_for_command(command):
fn = Daemon.callable_methods.get(command)
if fn:
print(dedent(fn.__doc__))
else:
print("Invalid command name")
def normalize_value(x, key=None):
if not isinstance(x, str):
return x
if key in ('uri', 'channel_name', 'name', 'file_name', 'download_directory'):
return x
if x.lower() == 'true':
return True
if x.lower() == 'false':
return False
if x.isdigit():
return int(x)
return x
def remove_brackets(key):
if key.startswith("<") and key.endswith(">"):
return str(key[1:-1])
return key
def set_kwargs(parsed_args):
kwargs = {}
for key, arg in parsed_args.items():
k = None
if arg is None:
continue
elif key.startswith("--") and remove_brackets(key[2:]) not in kwargs:
k = remove_brackets(key[2:])
elif remove_brackets(key) not in kwargs:
k = remove_brackets(key)
kwargs[k] = normalize_value(arg, k)
return kwargs
def main(argv=None):
argv = argv or sys.argv[1:]
if not argv:
print_help()
return 1
conf_path = None
if len(argv) and argv[0] == "--conf":
if len(argv) < 2:
print("No config file specified for --conf option")
print_help()
return 1
conf_path = argv[1]
argv = argv[2:]
method, args = argv[0], argv[1:]
if method in ['help', '--help', '-h']:
if len(args) == 1:
print_help_for_command(args[0])
else:
print_help()
return 0
elif method in ['version', '--version', '-v']:
print(json.dumps(get_platform(get_ip=False), sort_keys=True, indent=2, separators=(',', ': ')))
return 0
elif method == 'start':
sys.exit(daemon_main(args, conf_path))
elif method == 'console':
sys.exit(daemon_console())
elif method not in Daemon.callable_methods:
if method not in Daemon.deprecated_methods:
print('{} is not a valid command.'.format(method))
return 1
new_method = Daemon.deprecated_methods[method].new_command
print("{} is deprecated, using {}.".format(method, new_method))
method = new_method
fn = Daemon.callable_methods[method]
parsed = docopt(fn.__doc__, args)
params = set_kwargs(parsed)
loop = asyncio.get_event_loop()
loop.run_until_complete(execute_command(method, params, conf_path))
return 0
if __name__ == "__main__":
sys.exit(main())
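
The per-command parsing above leans on docopt reading each API method's docstring. A minimal sketch of that mechanism with a hypothetical usage string:

from docopt import docopt

usage = """
Usage:
    resolve_name <uri>
"""
parsed = docopt(usage, ["what"])   # argv as produced by: lbrynet resolve_name what
print(parsed)                      # {'<uri>': 'what'}
# set_kwargs() then strips the angle brackets, yielding {'uri': 'what'}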

View file

@ -29,6 +29,7 @@ ENV_NAMESPACE = 'LBRY_'
LBRYCRD_WALLET = 'lbrycrd'
LBRYUM_WALLET = 'lbryum'
PTC_WALLET = 'ptc'
TORBA_WALLET = 'torba'
PROTOCOL_PREFIX = 'lbry'
APP_NAME = 'LBRY'
@ -62,22 +63,6 @@ settings_encoders = {
conf_file = None
def _win_path_to_bytes(path):
"""
Encode Windows paths to string. appdirs.user_data_dir()
on windows will return unicode path, unlike other platforms
which returns string. This will cause problems
because we use strings for filenames and combining them with
os.path.join() will result in errors.
"""
for encoding in ('ASCII', 'MBCS'):
try:
return path.encode(encoding)
except (UnicodeEncodeError, LookupError):
pass
return path
def _get_old_directories(platform_type):
directories = {}
if platform_type == WINDOWS:
@ -142,9 +127,6 @@ elif 'win' in sys.platform:
dirs = _get_old_directories(WINDOWS)
else:
dirs = _get_new_directories(WINDOWS)
dirs['data'] = _win_path_to_bytes(dirs['data'])
dirs['lbryum'] = _win_path_to_bytes(dirs['lbryum'])
dirs['download'] = _win_path_to_bytes(dirs['download'])
else:
platform = LINUX
if os.path.isdir(_get_old_directories(LINUX)['data']) or \
@ -182,11 +164,11 @@ class Env(envparse.Env):
self._convert_key(key): self._convert_value(value)
for key, value in schema.items()
}
envparse.Env.__init__(self, **my_schema)
super().__init__(**my_schema)
def __call__(self, key, *args, **kwargs):
my_key = self._convert_key(key)
return super(Env, self).__call__(my_key, *args, **kwargs)
return super().__call__(my_key, *args, **kwargs)
@staticmethod
def _convert_key(key):
@ -307,12 +289,12 @@ ADJUSTABLE_SETTINGS = {
}
class Config(object):
class Config:
def __init__(self, fixed_defaults, adjustable_defaults, persisted_settings=None,
environment=None, cli_settings=None):
self._installation_id = None
self._session_id = base58.b58encode(utils.generate_id())
self._session_id = base58.b58encode(utils.generate_id()).decode()
self._node_id = None
self._fixed_defaults = fixed_defaults
@ -338,7 +320,7 @@ class Config(object):
self._data[TYPE_DEFAULT].update(self._fixed_defaults)
self._data[TYPE_DEFAULT].update(
{k: v[1] for (k, v) in self._adjustable_defaults.iteritems()})
{k: v[1] for (k, v) in self._adjustable_defaults.items()})
if persisted_settings is None:
persisted_settings = {}
@ -358,7 +340,7 @@ class Config(object):
return self.get_current_settings_dict().__repr__()
def __iter__(self):
for k in self._data[TYPE_DEFAULT].iterkeys():
for k in self._data[TYPE_DEFAULT].keys():
yield k
def __getitem__(self, name):
@ -481,7 +463,7 @@ class Config(object):
self._data[data_type][name] = value
def update(self, updated_settings, data_types=(TYPE_RUNTIME,)):
for k, v in updated_settings.iteritems():
for k, v in updated_settings.items():
try:
self.set(k, v, data_types=data_types)
except (KeyError, AssertionError):
@ -495,7 +477,7 @@ class Config(object):
def get_adjustable_settings_dict(self):
return {
key: val for key, val in self.get_current_settings_dict().iteritems()
key: val for key, val in self.get_current_settings_dict().items()
if key in self._adjustable_defaults
}
@ -516,7 +498,7 @@ class Config(object):
@staticmethod
def _convert_conf_file_lists_reverse(converted):
rev = {}
for k in converted.iterkeys():
for k in converted.keys():
if k in ADJUSTABLE_SETTINGS and len(ADJUSTABLE_SETTINGS[k]) == 4:
rev[k] = ADJUSTABLE_SETTINGS[k][3](converted[k])
else:
@ -526,7 +508,7 @@ class Config(object):
@staticmethod
def _convert_conf_file_lists(decoded):
converted = {}
for k, v in decoded.iteritems():
for k, v in decoded.items():
if k in ADJUSTABLE_SETTINGS and len(ADJUSTABLE_SETTINGS[k]) >= 3:
converted[k] = ADJUSTABLE_SETTINGS[k][2](v)
else:
@ -570,7 +552,7 @@ class Config(object):
if 'share_debug_info' in settings_dict:
settings_dict['share_usage_data'] = settings_dict['share_debug_info']
del settings_dict['share_debug_info']
for key in settings_dict.keys():
for key in list(settings_dict.keys()):
if not self._is_valid_setting(key):
log.warning('Ignoring invalid conf file setting: %s', key)
del settings_dict[key]
@ -618,7 +600,7 @@ class Config(object):
with open(install_id_filename, "r") as install_id_file:
self._installation_id = str(install_id_file.read()).strip()
if not self._installation_id:
self._installation_id = base58.b58encode(utils.generate_id())
self._installation_id = base58.b58encode(utils.generate_id()).decode()
with open(install_id_filename, "w") as install_id_file:
install_id_file.write(self._installation_id)
return self._installation_id
@ -632,20 +614,19 @@ class Config(object):
if not self._node_id:
self._node_id = utils.generate_id()
with open(node_id_filename, "w") as node_id_file:
node_id_file.write(base58.b58encode(self._node_id))
node_id_file.write(base58.b58encode(self._node_id).decode())
return self._node_id
def get_session_id(self):
return self._session_id
# type: Config
settings = None
settings = None # type: Config
def get_default_env():
env_defaults = {}
for k, v in ADJUSTABLE_SETTINGS.iteritems():
for k, v in ADJUSTABLE_SETTINGS.items():
if len(v) == 3:
env_defaults[k] = (v[0], None, v[2])
elif len(v) == 4:
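
Most of this hunk is the mechanical iteritems()/iterkeys() to items()/keys() port, but the list(settings_dict.keys()) change is load-bearing: Python 3 dict views raise RuntimeError if the dict is mutated while being iterated. A minimal sketch:

settings = {"share_usage_data": True, "bogus_key": 1}
for key in list(settings.keys()):   # snapshot the keys first...
    if key == "bogus_key":
        del settings[key]           # ...so deleting here cannot raise RuntimeError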

View file

@ -9,7 +9,7 @@ from decimal import Decimal
log = logging.getLogger(__name__)
class BlobAvailabilityTracker(object):
class BlobAvailabilityTracker:
"""
Class to track peer counts for known blobs, and to discover new popular blobs

View file

@ -1,4 +1,4 @@
class BlobInfo(object):
class BlobInfo:
"""
This structure is used to represent the metadata of a blob.
@ -16,4 +16,3 @@ class BlobInfo(object):
self.blob_hash = blob_hash
self.blob_num = blob_num
self.length = length

View file

@ -1,5 +1,6 @@
import logging
import os
from binascii import unhexlify
from sqlite3 import IntegrityError
from twisted.internet import threads, defer
from lbrynet.blob.blob_file import BlobFile
@ -8,7 +9,7 @@ from lbrynet.blob.creator import BlobFileCreator
log = logging.getLogger(__name__)
class DiskBlobManager(object):
class DiskBlobManager:
def __init__(self, blob_dir, storage, node_datastore=None):
"""
This class stores blobs on the hard disk
@ -60,7 +61,7 @@ class DiskBlobManager(object):
blob.blob_hash, blob.length, next_announce_time, should_announce
)
if self._node_datastore is not None:
self._node_datastore.completed_blobs.add(blob.blob_hash.decode('hex'))
self._node_datastore.completed_blobs.add(unhexlify(blob.blob_hash))
def completed_blobs(self, blobhashes_to_check):
return self._completed_blobs(blobhashes_to_check)
@ -100,7 +101,7 @@ class DiskBlobManager(object):
continue
if self._node_datastore is not None:
try:
self._node_datastore.completed_blobs.remove(blob_hash.decode('hex'))
self._node_datastore.completed_blobs.remove(unhexlify(blob_hash))
except KeyError:
pass
try:
@ -113,7 +114,7 @@ class DiskBlobManager(object):
try:
yield self.storage.delete_blobs_from_db(bh_to_delete_from_db)
except IntegrityError as err:
if err.message != "FOREIGN KEY constraint failed":
if str(err) != "FOREIGN KEY constraint failed":
raise err
@defer.inlineCallbacks
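
str.decode('hex') no longer exists on Python 3; binascii.unhexlify is the replacement used throughout this file. A sketch with a hypothetical 96-character blob hash:

from binascii import unhexlify

blob_hash = "9f86d081" * 12   # hypothetical 96-char hex digest
raw = unhexlify(blob_hash)    # replaces blob_hash.decode('hex')
assert len(raw) == 48         # 48 raw bytes, consistent with a sha384 digest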

View file

@ -1,4 +1,4 @@
class DownloadOptionChoice(object):
class DownloadOptionChoice:
"""A possible choice that can be picked for some option.
An option can have one or more choices that can be picked from.
@ -10,7 +10,7 @@ class DownloadOptionChoice(object):
self.bool_options_description = bool_options_description
class DownloadOption(object):
class DownloadOption:
"""An option for a user to select a value from several different choices."""
def __init__(self, option_types, long_description, short_description, default_value,
default_value_description):

View file

@ -1,3 +1,7 @@
class RPCError(Exception):
code = 0
class PriceDisagreementError(Exception):
pass
@ -12,19 +16,19 @@ class DownloadCanceledError(Exception):
class DownloadSDTimeout(Exception):
def __init__(self, download):
Exception.__init__(self, 'Failed to download sd blob {} within timeout'.format(download))
super().__init__('Failed to download sd blob {} within timeout'.format(download))
self.download = download
class DownloadTimeoutError(Exception):
def __init__(self, download):
Exception.__init__(self, 'Failed to download {} within timeout'.format(download))
super().__init__('Failed to download {} within timeout'.format(download))
self.download = download
class DownloadDataTimeout(Exception):
def __init__(self, download):
Exception.__init__(self, 'Failed to download data blobs for sd hash '
super().__init__('Failed to download data blobs for sd hash '
'{} within timeout'.format(download))
self.download = download
@ -41,8 +45,8 @@ class NullFundsError(Exception):
pass
class InsufficientFundsError(Exception):
pass
class InsufficientFundsError(RPCError):
code = -310
class ConnectionClosedBeforeResponseError(Exception):
@ -55,39 +59,41 @@ class KeyFeeAboveMaxAllowed(Exception):
class InvalidExchangeRateResponse(Exception):
def __init__(self, source, reason):
Exception.__init__(self, 'Failed to get exchange rate from {}:{}'.format(source, reason))
super().__init__('Failed to get exchange rate from {}:{}'.format(source, reason))
self.source = source
self.reason = reason
class UnknownNameError(Exception):
def __init__(self, name):
Exception.__init__(self, 'Name {} is unknown'.format(name))
super().__init__('Name {} is unknown'.format(name))
self.name = name
class UnknownClaimID(Exception):
def __init__(self, claim_id):
Exception.__init__(self, 'Claim {} is unknown'.format(claim_id))
super().__init__('Claim {} is unknown'.format(claim_id))
self.claim_id = claim_id
class UnknownURI(Exception):
def __init__(self, uri):
Exception.__init__(self, 'URI {} cannot be resolved'.format(uri))
super().__init__('URI {} cannot be resolved'.format(uri))
self.name = uri
class UnknownOutpoint(Exception):
def __init__(self, outpoint):
Exception.__init__(self, 'Outpoint {} cannot be resolved'.format(outpoint))
super().__init__('Outpoint {} cannot be resolved'.format(outpoint))
self.outpoint = outpoint
class InvalidName(Exception):
def __init__(self, name, invalid_characters):
self.name = name
self.invalid_characters = invalid_characters
Exception.__init__(
self, 'URI contains invalid characters: {}'.format(','.join(invalid_characters)))
super().__init__(
'URI contains invalid characters: {}'.format(','.join(invalid_characters)))
class UnknownStreamTypeError(Exception):
@ -105,7 +111,7 @@ class InvalidStreamDescriptorError(Exception):
class InvalidStreamInfoError(Exception):
def __init__(self, name, stream_info):
msg = '{} has claim with invalid stream info: {}'.format(name, stream_info)
Exception.__init__(self, msg)
super().__init__(msg)
self.name = name
self.stream_info = stream_info
@ -159,14 +165,14 @@ class NegotiationError(Exception):
class InvalidCurrencyError(Exception):
def __init__(self, currency):
self.currency = currency
Exception.__init__(
self, 'Invalid currency: {} is not a supported currency.'.format(currency))
super().__init__(
'Invalid currency: {} is not a supported currency.'.format(currency))
class NoSuchDirectoryError(Exception):
def __init__(self, directory):
self.directory = directory
Exception.__init__(self, 'No such directory {}'.format(directory))
super().__init__('No such directory {}'.format(directory))
class ComponentStartConditionNotMet(Exception):
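
Two patterns recur through this hunk: the zero-argument super().__init__(...) form Python 3 allows, and the new RPCError base class that carries a JSON-RPC-style error code. A condensed sketch of how a caller benefits:

class RPCError(Exception):
    code = 0

class InsufficientFundsError(RPCError):
    code = -310

try:
    raise InsufficientFundsError("not enough funds")   # hypothetical message
except RPCError as err:
    print(err.code)   # -310; a single except clause now covers every RPC error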

View file

@ -9,7 +9,7 @@ from lbrynet.core.Error import DownloadCanceledError
log = logging.getLogger(__name__)
class HTTPBlobDownloader(object):
class HTTPBlobDownloader:
'''
A downloader that is able to get blobs from HTTP mirrors.
Note that when a blob gets downloaded from a mirror or from a peer, BlobManager will mark it as completed

View file

@ -1,7 +1,7 @@
from decimal import Decimal
class Offer(object):
class Offer:
"""A rate offer to download blobs from a host."""
RATE_ACCEPTED = "RATE_ACCEPTED"

View file

@ -3,14 +3,14 @@ from lbrynet import conf
from decimal import Decimal
class BasePaymentRateManager(object):
class BasePaymentRateManager:
def __init__(self, rate=None, info_rate=None):
self.min_blob_data_payment_rate = rate if rate is not None else conf.settings['data_rate']
self.min_blob_info_payment_rate = (
info_rate if info_rate is not None else conf.settings['min_info_rate'])
class PaymentRateManager(object):
class PaymentRateManager:
def __init__(self, base, rate=None):
"""
@param base: a BasePaymentRateManager
@ -36,7 +36,7 @@ class PaymentRateManager(object):
self.points_paid += amount
class NegotiatedPaymentRateManager(object):
class NegotiatedPaymentRateManager:
def __init__(self, base, availability_tracker, generous=None):
"""
@param base: a BasePaymentRateManager
@ -84,7 +84,7 @@ class NegotiatedPaymentRateManager(object):
return False
class OnlyFreePaymentsManager(object):
class OnlyFreePaymentsManager:
def __init__(self, **kwargs):
"""
A payment rate manager that will only ever accept and offer a rate of 0.0,

View file

@ -3,7 +3,7 @@ from collections import defaultdict
from lbrynet.core import utils
# Do not create this object except through PeerManager
class Peer(object):
class Peer:
def __init__(self, host, port):
self.host = host
self.port = port

View file

@ -1,7 +1,7 @@
from lbrynet.core.Peer import Peer
class PeerManager(object):
class PeerManager:
def __init__(self):
self.peers = []

View file

@ -9,7 +9,7 @@ def get_default_price_model(blob_tracker, base_price, **kwargs):
return MeanAvailabilityWeightedPrice(blob_tracker, base_price, **kwargs)
class ZeroPrice(object):
class ZeroPrice:
def __init__(self):
self.base_price = 0.0
@ -17,7 +17,7 @@ class ZeroPrice(object):
return 0.0
class MeanAvailabilityWeightedPrice(object):
class MeanAvailabilityWeightedPrice:
"""Calculate mean-blob-availability and stream-position weighted price for a blob
Attributes:

View file

@ -1,14 +1,12 @@
import logging
from zope.interface import implements
from lbrynet.interfaces import IRateLimiter
from twisted.internet import task
log = logging.getLogger(__name__)
class DummyRateLimiter(object):
class DummyRateLimiter:
def __init__(self):
self.dl_bytes_this_second = 0
self.ul_bytes_this_second = 0
@ -46,10 +44,10 @@ class DummyRateLimiter(object):
self.total_ul_bytes += num_bytes
class RateLimiter(object):
class RateLimiter:
"""This class ensures that upload and download rates don't exceed specified maximums"""
implements(IRateLimiter)
#implements(IRateLimiter)
#called by main application
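
The commented-out implements(IRateLimiter) relies on Python-2-only class advice, which is why it no longer runs here. Under Python 3 the decorator form is the working equivalent; a sketch of that alternative (not what this commit does, it simply disables the call):

from zope.interface import Interface, implementer

class IRateLimiter(Interface):
    """Marker interface (sketch)."""

@implementer(IRateLimiter)   # decorator form, valid on Python 3
class RateLimiter:
    pass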

View file

@ -19,7 +19,7 @@ log = logging.getLogger(__name__)
class SinglePeerFinder(DummyPeerFinder):
def __init__(self, peer):
DummyPeerFinder.__init__(self)
super().__init__()
self.peer = peer
def find_peers_for_blob(self, blob_hash, timeout=None, filter_self=False):
@ -28,7 +28,7 @@ class SinglePeerFinder(DummyPeerFinder):
class BlobCallback(BlobFile):
def __init__(self, blob_dir, blob_hash, timeout):
BlobFile.__init__(self, blob_dir, blob_hash)
super().__init__(blob_dir, blob_hash)
self.callback = defer.Deferred()
reactor.callLater(timeout, self._cancel)
@ -43,7 +43,7 @@ class BlobCallback(BlobFile):
return result
class SingleBlobDownloadManager(object):
class SingleBlobDownloadManager:
def __init__(self, blob):
self.blob = blob
@ -57,7 +57,7 @@ class SingleBlobDownloadManager(object):
return self.blob.blob_hash
class SinglePeerDownloader(object):
class SinglePeerDownloader:
def __init__(self):
self._payment_rate_manager = OnlyFreePaymentsManager()
self._rate_limiter = DummyRateLimiter()

View file

@ -10,7 +10,7 @@ def get_default_strategy(blob_tracker, **kwargs):
return BasicAvailabilityWeightedStrategy(blob_tracker, **kwargs)
class Strategy(object):
class Strategy:
"""
Base for negotiation strategies
"""
@ -109,7 +109,7 @@ class BasicAvailabilityWeightedStrategy(Strategy):
base_price=0.0001, alpha=1.0):
price_model = MeanAvailabilityWeightedPrice(
blob_tracker, base_price=base_price, alpha=alpha)
Strategy.__init__(self, price_model, max_rate, min_rate, is_generous)
super().__init__(price_model, max_rate, min_rate, is_generous)
self._acceleration = Decimal(acceleration) # rate of how quickly to ramp offer
self._deceleration = Decimal(deceleration)
@ -140,7 +140,7 @@ class OnlyFreeStrategy(Strategy):
implementer(INegotiationStrategy)
def __init__(self, *args, **kwargs):
price_model = ZeroPrice()
Strategy.__init__(self, price_model, 0.0, 0.0, True)
super().__init__(price_model, 0.0, 0.0, True)
def _get_mean_rate(self, rates):
return 0.0

View file

@ -1,4 +1,5 @@
import binascii
from binascii import unhexlify
import string
from collections import defaultdict
import json
import logging
@ -12,7 +13,14 @@ from lbrynet.core.HTTPBlobDownloader import HTTPBlobDownloader
log = logging.getLogger(__name__)
class StreamDescriptorReader(object):
class JSONBytesEncoder(json.JSONEncoder):
def default(self, obj): # pylint: disable=E0202
if isinstance(obj, bytes):
return obj.decode()
return super().default(obj)
class StreamDescriptorReader:
"""Classes which derive from this class read a stream descriptor file return
a dictionary containing the fields in the file"""
def __init__(self):
@ -33,7 +41,7 @@ class StreamDescriptorReader(object):
class PlainStreamDescriptorReader(StreamDescriptorReader):
"""Read a stream descriptor file which is not a blob but a regular file"""
def __init__(self, stream_descriptor_filename):
StreamDescriptorReader.__init__(self)
super().__init__()
self.stream_descriptor_filename = stream_descriptor_filename
def _get_raw_data(self):
@ -49,7 +57,7 @@ class PlainStreamDescriptorReader(StreamDescriptorReader):
class BlobStreamDescriptorReader(StreamDescriptorReader):
"""Read a stream descriptor file which is a blob"""
def __init__(self, blob):
StreamDescriptorReader.__init__(self)
super().__init__()
self.blob = blob
def _get_raw_data(self):
@ -66,14 +74,16 @@ class BlobStreamDescriptorReader(StreamDescriptorReader):
return threads.deferToThread(get_data)
class StreamDescriptorWriter(object):
class StreamDescriptorWriter:
"""Classes which derive from this class write fields from a dictionary
of fields to a stream descriptor"""
def __init__(self):
pass
def create_descriptor(self, sd_info):
return self._write_stream_descriptor(json.dumps(sd_info))
return self._write_stream_descriptor(
json.dumps(sd_info, sort_keys=True).encode()
)
def _write_stream_descriptor(self, raw_data):
"""This method must be overridden by subclasses to write raw data to
@ -84,7 +94,7 @@ class StreamDescriptorWriter(object):
class PlainStreamDescriptorWriter(StreamDescriptorWriter):
def __init__(self, sd_file_name):
StreamDescriptorWriter.__init__(self)
super().__init__()
self.sd_file_name = sd_file_name
def _write_stream_descriptor(self, raw_data):
@ -100,7 +110,7 @@ class PlainStreamDescriptorWriter(StreamDescriptorWriter):
class BlobStreamDescriptorWriter(StreamDescriptorWriter):
def __init__(self, blob_manager):
StreamDescriptorWriter.__init__(self)
super().__init__()
self.blob_manager = blob_manager
@defer.inlineCallbacks
@ -114,7 +124,7 @@ class BlobStreamDescriptorWriter(StreamDescriptorWriter):
defer.returnValue(sd_hash)
class StreamMetadata(object):
class StreamMetadata:
FROM_BLOB = 1
FROM_PLAIN = 2
@ -127,7 +137,7 @@ class StreamMetadata(object):
self.source_file = None
class StreamDescriptorIdentifier(object):
class StreamDescriptorIdentifier:
"""Tries to determine the type of stream described by the stream descriptor using the
'stream_type' field. Keeps a list of StreamDescriptorValidators and StreamDownloaderFactorys
and returns the appropriate ones based on the type of the stream descriptor given
@ -254,7 +264,7 @@ def save_sd_info(blob_manager, sd_hash, sd_info):
(sd_hash, calculated_sd_hash))
stream_hash = yield blob_manager.storage.get_stream_hash_for_sd_hash(sd_hash)
if not stream_hash:
log.debug("Saving info for %s", sd_info['stream_name'].decode('hex'))
log.debug("Saving info for %s", unhexlify(sd_info['stream_name']))
stream_name = sd_info['stream_name']
key = sd_info['key']
stream_hash = sd_info['stream_hash']
@ -272,9 +282,9 @@ def format_blobs(crypt_blob_infos):
for blob_info in crypt_blob_infos:
blob = {}
if blob_info.length != 0:
blob['blob_hash'] = str(blob_info.blob_hash)
blob['blob_hash'] = blob_info.blob_hash
blob['blob_num'] = blob_info.blob_num
blob['iv'] = str(blob_info.iv)
blob['iv'] = blob_info.iv
blob['length'] = blob_info.length
formatted_blobs.append(blob)
return formatted_blobs
@ -344,18 +354,18 @@ def get_blob_hashsum(b):
iv = b['iv']
blob_hashsum = get_lbry_hash_obj()
if length != 0:
blob_hashsum.update(blob_hash)
blob_hashsum.update(str(blob_num))
blob_hashsum.update(iv)
blob_hashsum.update(str(length))
blob_hashsum.update(blob_hash.encode())
blob_hashsum.update(str(blob_num).encode())
blob_hashsum.update(iv.encode())
blob_hashsum.update(str(length).encode())
return blob_hashsum.digest()
def get_stream_hash(hex_stream_name, key, hex_suggested_file_name, blob_infos):
h = get_lbry_hash_obj()
h.update(hex_stream_name)
h.update(key)
h.update(hex_suggested_file_name)
h.update(hex_stream_name.encode())
h.update(key.encode())
h.update(hex_suggested_file_name.encode())
blobs_hashsum = get_lbry_hash_obj()
for blob in blob_infos:
blobs_hashsum.update(get_blob_hashsum(blob))
@ -364,9 +374,8 @@ def get_stream_hash(hex_stream_name, key, hex_suggested_file_name, blob_infos):
def verify_hex(text, field_name):
for c in text:
if c not in '0123456789abcdef':
raise InvalidStreamDescriptorError("%s is not a hex-encoded string" % field_name)
if not set(text).issubset(set(string.hexdigits)):
raise InvalidStreamDescriptorError("%s is not a hex-encoded string" % field_name)
def validate_descriptor(stream_info):
@ -397,7 +406,7 @@ def validate_descriptor(stream_info):
return True
class EncryptedFileStreamDescriptorValidator(object):
class EncryptedFileStreamDescriptorValidator:
def __init__(self, raw_info):
self.raw_info = raw_info
@ -406,14 +415,14 @@ class EncryptedFileStreamDescriptorValidator(object):
def info_to_show(self):
info = []
info.append(("stream_name", binascii.unhexlify(self.raw_info.get("stream_name"))))
info.append(("stream_name", unhexlify(self.raw_info.get("stream_name"))))
size_so_far = 0
for blob_info in self.raw_info.get("blobs", []):
size_so_far += int(blob_info['length'])
info.append(("stream_size", str(self.get_length_of_stream())))
suggested_file_name = self.raw_info.get("suggested_file_name", None)
if suggested_file_name is not None:
suggested_file_name = binascii.unhexlify(suggested_file_name)
suggested_file_name = unhexlify(suggested_file_name)
info.append(("suggested_file_name", suggested_file_name))
return info
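
The .encode() calls added to the hashing helpers exist because hashlib's update() only accepts bytes on Python 3. A minimal sketch, with sha384 standing in for get_lbry_hash_obj() (an assumption based on its historical value):

import hashlib

h = hashlib.sha384()            # stand-in for get_lbry_hash_obj()
h.update("deadbeef".encode())   # str values must be encoded first
h.update(str(42).encode())      # numeric fields too: str(...), then encode()
print(h.hexdigest())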

File diff suppressed because it is too large.

View file

@ -8,7 +8,7 @@ DELAY_INCREMENT = 0.0001
QUEUE_SIZE_THRESHOLD = 100
class CallLaterManager(object):
class CallLaterManager:
def __init__(self, callLater):
"""
:param callLater: (IReactorTime.callLater)

View file

@ -5,13 +5,11 @@ from decimal import Decimal
from twisted.internet import defer
from twisted.python.failure import Failure
from twisted.internet.error import ConnectionAborted
from zope.interface import implements
from lbrynet.core.Error import ConnectionClosedBeforeResponseError
from lbrynet.core.Error import InvalidResponseError, RequestCanceledError, NoResponseError
from lbrynet.core.Error import PriceDisagreementError, DownloadCanceledError, InsufficientFundsError
from lbrynet.core.client.ClientRequest import ClientRequest, ClientBlobRequest
from lbrynet.interfaces import IRequestCreator
from lbrynet.core.Offer import Offer
@ -39,8 +37,8 @@ def cache(fn):
return helper
class BlobRequester(object):
implements(IRequestCreator)
class BlobRequester:
#implements(IRequestCreator)
def __init__(self, blob_manager, peer_finder, payment_rate_manager, wallet, download_manager):
self.blob_manager = blob_manager
@ -163,7 +161,7 @@ class BlobRequester(object):
return True
def _get_bad_peers(self):
return [p for p in self._peers.iterkeys() if not self._should_send_request_to(p)]
return [p for p in self._peers.keys() if not self._should_send_request_to(p)]
def _hash_available(self, blob_hash):
for peer in self._available_blobs:
@ -195,7 +193,7 @@ class BlobRequester(object):
self._peers[peer] += amount
class RequestHelper(object):
class RequestHelper:
def __init__(self, requestor, peer, protocol, payment_rate_manager):
self.requestor = requestor
self.peer = peer
@ -429,7 +427,7 @@ class PriceRequest(RequestHelper):
class DownloadRequest(RequestHelper):
"""Choose a blob and download it from a peer and also pay the peer for the data."""
def __init__(self, requester, peer, protocol, payment_rate_manager, wallet, head_blob_hash):
RequestHelper.__init__(self, requester, peer, protocol, payment_rate_manager)
super().__init__(requester, peer, protocol, payment_rate_manager)
self.wallet = wallet
self.head_blob_hash = head_blob_hash
@ -578,7 +576,7 @@ class DownloadRequest(RequestHelper):
return reason
class BlobDownloadDetails(object):
class BlobDownloadDetails:
"""Contains the information needed to make a ClientBlobRequest from an open blob"""
def __init__(self, blob, deferred, write_func, cancel_func, peer):
self.blob = blob

View file

@ -10,8 +10,6 @@ from lbrynet.core import utils
from lbrynet.core.Error import ConnectionClosedBeforeResponseError, NoResponseError
from lbrynet.core.Error import DownloadCanceledError, MisbehavingPeerError
from lbrynet.core.Error import RequestCanceledError
from lbrynet.interfaces import IRequestSender, IRateLimited
from zope.interface import implements
log = logging.getLogger(__name__)
@ -24,7 +22,7 @@ def encode_decimal(obj):
class ClientProtocol(Protocol, TimeoutMixin):
implements(IRequestSender, IRateLimited)
#implements(IRequestSender, IRateLimited)
######### Protocol #########
PROTOCOL_TIMEOUT = 30
@ -34,7 +32,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
self._rate_limiter = self.factory.rate_limiter
self.peer = self.factory.peer
self._response_deferreds = {}
self._response_buff = ''
self._response_buff = b''
self._downloading_blob = False
self._blob_download_request = None
self._next_request = {}
@ -61,7 +59,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
self.transport.loseConnection()
response, extra_data = self._get_valid_response(self._response_buff)
if response is not None:
self._response_buff = ''
self._response_buff = b''
self._handle_response(response)
if self._downloading_blob is True and len(extra_data) != 0:
self._blob_download_request.write(extra_data)
@ -71,17 +69,17 @@ class ClientProtocol(Protocol, TimeoutMixin):
self.peer.report_down()
self.transport.abortConnection()
def connectionLost(self, reason):
def connectionLost(self, reason=None):
log.debug("Connection lost to %s: %s", self.peer, reason)
self.setTimeout(None)
self.connection_closed = True
if reason.check(error.ConnectionDone):
if reason is None or reason.check(error.ConnectionDone):
err = failure.Failure(ConnectionClosedBeforeResponseError())
else:
err = reason
for key, d in self._response_deferreds.items():
del self._response_deferreds[key]
d.errback(err)
self._response_deferreds.clear()
if self._blob_download_request is not None:
self._blob_download_request.cancel(err)
self.factory.connection_was_made_deferred.callback(True)
@ -111,7 +109,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
self.connection_closing = True
ds = []
err = RequestCanceledError()
for key, d in self._response_deferreds.items():
for key, d in list(self._response_deferreds.items()):
del self._response_deferreds[key]
d.errback(err)
ds.append(d)
@ -126,7 +124,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
def _handle_request_error(self, err):
log.error("An unexpected error occurred creating or sending a request to %s. %s: %s",
self.peer, err.type, err.message)
self.peer, err.type, err)
self.transport.loseConnection()
def _ask_for_request(self):
@ -151,7 +149,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
self.setTimeout(self.PROTOCOL_TIMEOUT)
# TODO: compare this message to the last one. If they're the same,
# TODO: incrementally delay this message.
m = json.dumps(request_msg, default=encode_decimal)
m = json.dumps(request_msg, default=encode_decimal).encode()
self.transport.write(m)
def _get_valid_response(self, response_msg):
@ -159,7 +157,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
response = None
curr_pos = 0
while 1:
next_close_paren = response_msg.find('}', curr_pos)
next_close_paren = response_msg.find(b'}', curr_pos)
if next_close_paren != -1:
curr_pos = next_close_paren + 1
try:

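The ClientProtocol changes move the wire format fully to bytes: the receive buffer starts as b'', outgoing requests are serialized with json.dumps(...).encode() before transport.write(), and the response scanner needs a bytes needle for a bytes haystack. A condensed sketch of that scan-and-try-parse approach (extract_response is an illustrative name, not the method's):

import json

def extract_response(buff: bytes):
    # Mirrors _get_valid_response: try to parse the buffer up to each
    # successive b'}'; whatever follows a complete object is blob data.
    curr_pos = 0
    while True:
        next_close_paren = buff.find(b'}', curr_pos)  # bytes needle
        if next_close_paren == -1:
            return None, buff  # no complete object yet
        curr_pos = next_close_paren + 1
        try:
            # json.loads accepts bytes on Python 3.6+
            return json.loads(buff[:curr_pos]), buff[curr_pos:]
        except ValueError:
            continue  # that '}' did not close the object; keep scanning

response, extra = extract_response(b'{"requested_blob": true}BLOBBYTES')
assert response == {"requested_blob": True} and extra == b'BLOBBYTES'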

@ -1,7 +1,7 @@
from lbrynet.blob.blob_file import MAX_BLOB_SIZE
class ClientRequest(object):
class ClientRequest:
def __init__(self, request_dict, response_identifier=None):
self.request_dict = request_dict
self.response_identifier = response_identifier
@ -9,7 +9,7 @@ class ClientRequest(object):
class ClientPaidRequest(ClientRequest):
def __init__(self, request_dict, response_identifier, max_pay_units):
ClientRequest.__init__(self, request_dict, response_identifier)
super().__init__(request_dict, response_identifier)
self.max_pay_units = max_pay_units
@ -20,7 +20,7 @@ class ClientBlobRequest(ClientPaidRequest):
max_pay_units = MAX_BLOB_SIZE
else:
max_pay_units = blob.length
ClientPaidRequest.__init__(self, request_dict, response_identifier, max_pay_units)
super().__init__(request_dict, response_identifier, max_pay_units)
self.write = write_func
self.finished_deferred = finished_deferred
self.cancel = cancel_func


@ -1,8 +1,6 @@
import random
import logging
from twisted.internet import defer, reactor
from zope.interface import implements
from lbrynet import interfaces
from lbrynet import conf
from lbrynet.core.client.ClientProtocol import ClientProtocolFactory
from lbrynet.core.Error import InsufficientFundsError
@ -11,15 +9,15 @@ from lbrynet.core import utils
log = logging.getLogger(__name__)
class PeerConnectionHandler(object):
class PeerConnectionHandler:
def __init__(self, request_creators, factory):
self.request_creators = request_creators
self.factory = factory
self.connection = None
class ConnectionManager(object):
implements(interfaces.IConnectionManager)
class ConnectionManager:
#implements(interfaces.IConnectionManager)
MANAGE_CALL_INTERVAL_SEC = 5
TCP_CONNECT_TIMEOUT = 15
@ -98,7 +96,8 @@ class ConnectionManager(object):
d.addBoth(lambda _: disconnect_peer(p))
return d
closing_deferreds = [close_connection(peer) for peer in self._peer_connections.keys()]
# fixme: stop modifying dict during iteration
closing_deferreds = [close_connection(peer) for peer in list(self._peer_connections)]
return defer.DeferredList(closing_deferreds)
@defer.inlineCallbacks
@ -226,5 +225,3 @@ class ConnectionManager(object):
del self._connections_closing[peer]
d.callback(True)
return connection_was_made

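The ConnectionManager fix snapshots the peer dict before tearing connections down: each teardown removes an entry, and Python 3 raises RuntimeError when a dict changes size during iteration over a live view. A runnable sketch under illustrative names (close_connection stands in for the real per-peer teardown):

from twisted.internet import defer

def close_connection(peer, peer_connections):
    # Hypothetical stand-in for the real teardown: it mutates the dict.
    del peer_connections[peer]
    return defer.succeed(True)

@defer.inlineCallbacks
def stop_all(peer_connections):
    # list() snapshots the keys before the comprehension starts
    # deleting entries out from under the iteration.
    closing = [close_connection(p, peer_connections)
               for p in list(peer_connections)]
    results = yield defer.DeferredList(closing)
    defer.returnValue(results)

conns = {"peer-a": object(), "peer-b": object()}
stop_all(conns)
assert conns == {}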

@ -1,14 +1,12 @@
import logging
from twisted.internet import defer
from zope.interface import implements
from lbrynet import interfaces
log = logging.getLogger(__name__)
class DownloadManager(object):
implements(interfaces.IDownloadManager)
class DownloadManager:
#implements(interfaces.IDownloadManager)
def __init__(self, blob_manager):
self.blob_manager = blob_manager
@ -81,14 +79,14 @@ class DownloadManager(object):
return self.blob_handler.handle_blob(self.blobs[blob_num], self.blob_infos[blob_num])
def calculate_total_bytes(self):
return sum([bi.length for bi in self.blob_infos.itervalues()])
return sum([bi.length for bi in self.blob_infos.values()])
def calculate_bytes_left_to_output(self):
if not self.blobs:
return self.calculate_total_bytes()
else:
to_be_outputted = [
b for n, b in self.blobs.iteritems()
b for n, b in self.blobs.items()
if n >= self.progress_manager.last_blob_outputted
]
return sum([b.length for b in to_be_outputted if b.length is not None])


@ -1,6 +1,4 @@
import logging
from zope.interface import implements
from lbrynet import interfaces
from lbrynet.core.BlobInfo import BlobInfo
from lbrynet.core.client.BlobRequester import BlobRequester
from lbrynet.core.client.ConnectionManager import ConnectionManager
@ -14,8 +12,8 @@ from twisted.internet.task import LoopingCall
log = logging.getLogger(__name__)
class SingleBlobMetadataHandler(object):
implements(interfaces.IMetadataHandler)
class SingleBlobMetadataHandler:
#implements(interfaces.IMetadataHandler)
def __init__(self, blob_hash, download_manager):
self.blob_hash = blob_hash
@ -31,7 +29,7 @@ class SingleBlobMetadataHandler(object):
return 0
class SingleProgressManager(object):
class SingleProgressManager:
def __init__(self, download_manager, finished_callback, timeout_callback, timeout):
self.finished_callback = finished_callback
self.timeout_callback = timeout_callback
@ -71,10 +69,10 @@ class SingleProgressManager(object):
def needed_blobs(self):
blobs = self.download_manager.blobs
assert len(blobs) == 1
return [b for b in blobs.itervalues() if not b.get_is_verified()]
return [b for b in blobs.values() if not b.get_is_verified()]
class DummyBlobHandler(object):
class DummyBlobHandler:
def __init__(self):
pass
@ -82,7 +80,7 @@ class DummyBlobHandler(object):
pass
class StandaloneBlobDownloader(object):
class StandaloneBlobDownloader:
def __init__(self, blob_hash, blob_manager, peer_finder,
rate_limiter, payment_rate_manager, wallet,
timeout=None):


@ -1,14 +1,12 @@
import logging
from lbrynet.interfaces import IProgressManager
from twisted.internet import defer
from zope.interface import implements
log = logging.getLogger(__name__)
class StreamProgressManager(object):
implements(IProgressManager)
class StreamProgressManager:
#implements(IProgressManager)
def __init__(self, finished_callback, blob_manager,
download_manager, delete_blob_after_finished=False):
@ -82,8 +80,8 @@ class StreamProgressManager(object):
class FullStreamProgressManager(StreamProgressManager):
def __init__(self, finished_callback, blob_manager,
download_manager, delete_blob_after_finished=False):
StreamProgressManager.__init__(self, finished_callback, blob_manager, download_manager,
delete_blob_after_finished)
super().__init__(finished_callback, blob_manager, download_manager,
delete_blob_after_finished)
self.outputting_d = None
######### IProgressManager #########
@ -103,15 +101,15 @@ class FullStreamProgressManager(StreamProgressManager):
if not blobs:
return 0
else:
for i in xrange(max(blobs.iterkeys())):
for i in range(max(blobs.keys())):
if self._done(i, blobs):
return i
return max(blobs.iterkeys()) + 1
return max(blobs.keys()) + 1
def needed_blobs(self):
blobs = self.download_manager.blobs
return [
b for n, b in blobs.iteritems()
b for n, b in blobs.items()
if not b.get_is_verified() and not n in self.provided_blob_nums
]


@ -1,17 +0,0 @@
import os
from contextlib import contextmanager
@contextmanager
def get_read_handle(path):
"""
Get an OS-independent read handle for a file
"""
if os.name == "nt":
file_mode = 'rb'
else:
file_mode = 'r'
read_handle = open(path, file_mode)
yield read_handle
read_handle.close()


@ -14,7 +14,7 @@ from lbrynet.core import utils
class HTTPSHandler(logging.Handler):
def __init__(self, url, fqdn=False, localname=None, facility=None, cookies=None):
logging.Handler.__init__(self)
super().__init__()
self.url = url
self.fqdn = fqdn
self.localname = localname
@ -243,7 +243,7 @@ def configure_twisted():
observer.start()
class LoggerNameFilter(object):
class LoggerNameFilter:
"""Filter a log record based on its name.
Allows all info-level and higher records to pass through.


@ -1,4 +1,4 @@
class LoopingCallManager(object):
class LoopingCallManager:
def __init__(self, calls=None):
self.calls = calls or {}
@ -15,6 +15,6 @@ class LoopingCallManager(object):
self.calls[name].stop()
def shutdown(self):
for lcall in self.calls.itervalues():
for lcall in self.calls.values():
if lcall.running:
lcall.stop()


@ -1,14 +1,12 @@
import logging
from twisted.internet import defer
from zope.interface import implements
from lbrynet.interfaces import IQueryHandlerFactory, IQueryHandler
log = logging.getLogger(__name__)
class BlobAvailabilityHandlerFactory(object):
implements(IQueryHandlerFactory)
class BlobAvailabilityHandlerFactory:
# implements(IQueryHandlerFactory)
def __init__(self, blob_manager):
self.blob_manager = blob_manager
@ -26,8 +24,8 @@ class BlobAvailabilityHandlerFactory(object):
return "Blob Availability - blobs that are available to be uploaded"
class BlobAvailabilityHandler(object):
implements(IQueryHandler)
class BlobAvailabilityHandler:
#implements(IQueryHandler)
def __init__(self, blob_manager):
self.blob_manager = blob_manager


@ -3,17 +3,15 @@ import logging
from twisted.internet import defer
from twisted.protocols.basic import FileSender
from twisted.python.failure import Failure
from zope.interface import implements
from lbrynet import analytics
from lbrynet.core.Offer import Offer
from lbrynet.interfaces import IQueryHandlerFactory, IQueryHandler, IBlobSender
log = logging.getLogger(__name__)
class BlobRequestHandlerFactory(object):
implements(IQueryHandlerFactory)
class BlobRequestHandlerFactory:
#implements(IQueryHandlerFactory)
def __init__(self, blob_manager, wallet, payment_rate_manager, analytics_manager):
self.blob_manager = blob_manager
@ -35,8 +33,8 @@ class BlobRequestHandlerFactory(object):
return "Blob Uploader - uploads blobs"
class BlobRequestHandler(object):
implements(IQueryHandler, IBlobSender)
class BlobRequestHandler:
#implements(IQueryHandler, IBlobSender)
PAYMENT_RATE_QUERY = 'blob_data_payment_rate'
BLOB_QUERY = 'requested_blob'
AVAILABILITY_QUERY = 'requested_blobs'


@ -1,8 +1,7 @@
import logging
from twisted.internet import interfaces, error
from twisted.internet import error
from twisted.internet.protocol import Protocol, ServerFactory
from twisted.python import failure
from zope.interface import implements
from lbrynet.core.server.ServerRequestHandler import ServerRequestHandler
@ -24,7 +23,7 @@ class ServerProtocol(Protocol):
10) Pause/resume production when told by the rate limiter
"""
implements(interfaces.IConsumer)
#implements(interfaces.IConsumer)
#Protocol stuff


@ -1,25 +1,23 @@
import json
import logging
from twisted.internet import interfaces, defer
from zope.interface import implements
from lbrynet.interfaces import IRequestHandler
from twisted.internet import defer
log = logging.getLogger(__name__)
class ServerRequestHandler(object):
class ServerRequestHandler:
"""This class handles requests from clients. It can upload blobs and
respond to requests for information about blobs that are
associated with streams.
"""
implements(interfaces.IPushProducer, interfaces.IConsumer, IRequestHandler)
#implements(interfaces.IPushProducer, interfaces.IConsumer, IRequestHandler)
def __init__(self, consumer):
self.consumer = consumer
self.production_paused = False
self.request_buff = ''
self.response_buff = ''
self.request_buff = b''
self.response_buff = b''
self.producer = None
self.request_received = False
self.CHUNK_SIZE = 2**14
@ -56,7 +54,7 @@ class ServerRequestHandler(object):
return
chunk = self.response_buff[:self.CHUNK_SIZE]
self.response_buff = self.response_buff[self.CHUNK_SIZE:]
if chunk == '':
if chunk == b'':
return
log.trace("writing %s bytes to the client", len(chunk))
self.consumer.write(chunk)
@ -101,7 +99,7 @@ class ServerRequestHandler(object):
self.request_buff = self.request_buff + data
msg = self.try_to_parse_request(self.request_buff)
if msg:
self.request_buff = ''
self.request_buff = b''
self._process_msg(msg)
else:
log.debug("Request buff not a valid json message")
@ -134,7 +132,7 @@ class ServerRequestHandler(object):
self._produce_more()
def send_response(self, msg):
m = json.dumps(msg)
m = json.dumps(msg).encode()
log.debug("Sending a response of length %s", str(len(m)))
log.debug("Response: %s", str(m))
self.response_buff = self.response_buff + m
@ -167,7 +165,7 @@ class ServerRequestHandler(object):
return True
ds = []
for query_handler, query_identifiers in self.query_handlers.iteritems():
for query_handler, query_identifiers in self.query_handlers.items():
queries = {q_i: msg[q_i] for q_i in query_identifiers if q_i in msg}
d = query_handler.handle_queries(queries)
d.addErrback(log_errors)

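The ServerRequestHandler buffers likewise become bytes: slicing a bytes object yields bytes, and the emptiness test must compare against b'' rather than ''. A small sketch of the chunked drain performed by _produce_more() (drain is an illustrative name):

CHUNK_SIZE = 2 ** 14

def drain(response_buff: bytes, write):
    # Slice fixed-size chunks off the front of the bytes buffer and
    # hand them to the consumer, stopping on the empty bytes object.
    while True:
        chunk = response_buff[:CHUNK_SIZE]
        response_buff = response_buff[CHUNK_SIZE:]
        if chunk == b'':
            return response_buff
        write(chunk)

out = []
drain(b"x" * 40000, out.append)
assert [len(c) for c in out] == [16384, 16384, 7232]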

@ -3,9 +3,9 @@ import json
import subprocess
import os
from urllib2 import urlopen, URLError
from six.moves.urllib import request
from six.moves.urllib.error import URLError
from lbryschema import __version__ as lbryschema_version
from lbryum import __version__ as LBRYUM_VERSION
from lbrynet import build_type, __version__ as lbrynet_version
from lbrynet.conf import ROOT_DIR
@ -18,9 +18,9 @@ def get_lbrynet_version():
return subprocess.check_output(
['git', '--git-dir='+git_dir, 'describe', '--dirty', '--always'],
stderr=devnull
).strip().lstrip('v')
).decode().strip().lstrip('v')
except (subprocess.CalledProcessError, OSError):
print "failed to get version from git"
print("failed to get version from git")
return lbrynet_version
@ -32,19 +32,21 @@ def get_platform(get_ip=True):
"os_release": platform.release(),
"os_system": platform.system(),
"lbrynet_version": get_lbrynet_version(),
"lbryum_version": LBRYUM_VERSION,
"lbryschema_version": lbryschema_version,
"build": build_type.BUILD, # CI server sets this during build step
}
if p["os_system"] == "Linux":
import distro
p["distro"] = distro.info()
p["desktop"] = os.environ.get('XDG_CURRENT_DESKTOP', 'Unknown')
try:
import distro
p["distro"] = distro.info()
p["desktop"] = os.environ.get('XDG_CURRENT_DESKTOP', 'Unknown')
except ModuleNotFoundError:
pass
# TODO: remove this from get_platform and add a get_external_ip function using treq
if get_ip:
try:
response = json.loads(urlopen("https://api.lbry.io/ip").read())
response = json.loads(request.urlopen("https://api.lbry.io/ip").read())
if not response['success']:
raise URLError("failed to get external ip")
p['ip'] = response['data']['ip']

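six.moves lets one import line resolve to urllib2 on Python 2 and to urllib.request/urllib.error on Python 3, which is why the urllib2 import above is rewritten rather than removed. A sketch of the external-IP lookup that get_platform() performs, pulled into a standalone helper (get_external_ip is an illustrative name, and the call hits the network):

import json
from six.moves.urllib import request
from six.moves.urllib.error import URLError

def get_external_ip():
    # Same endpoint the ported get_platform() queries.
    try:
        response = json.loads(request.urlopen("https://api.lbry.io/ip").read())
        if not response['success']:
            raise URLError("failed to get external ip")
        return response['data']['ip']
    except URLError:
        return None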

@ -1,4 +1,5 @@
import base64
import codecs
import datetime
import random
import socket
@ -62,9 +63,9 @@ def safe_stop_looping_call(looping_call):
def generate_id(num=None):
h = get_lbry_hash_obj()
if num is not None:
h.update(str(num))
h.update(str(num).encode())
else:
h.update(str(random.getrandbits(512)))
h.update(str(random.getrandbits(512)).encode())
return h.digest()
@ -88,15 +89,19 @@ def version_is_greater_than(a, b):
return pkg_resources.parse_version(a) > pkg_resources.parse_version(b)
def rot13(some_str):
return codecs.encode(some_str, 'rot_13')
def deobfuscate(obfustacated):
return base64.b64decode(obfustacated.decode('rot13'))
return base64.b64decode(rot13(obfustacated))
def obfuscate(plain):
return base64.b64encode(plain).encode('rot13')
return rot13(base64.b64encode(plain).decode())
def check_connection(server="lbry.io", port=80, timeout=2):
def check_connection(server="lbry.io", port=80, timeout=5):
"""Attempts to open a socket to server:port and returns True if successful."""
log.debug('Checking connection to %s:%s', server, port)
try:
@ -142,7 +147,7 @@ def get_sd_hash(stream_info):
get('source', {}).\
get('source')
if not result:
log.warn("Unable to get sd_hash")
log.warning("Unable to get sd_hash")
return result
@ -150,7 +155,7 @@ def json_dumps_pretty(obj, **kwargs):
return json.dumps(obj, sort_keys=True, indent=2, separators=(',', ': '), **kwargs)
class DeferredLockContextManager(object):
class DeferredLockContextManager:
def __init__(self, lock):
self._lock = lock
@ -166,7 +171,7 @@ def DeferredDict(d, consumeErrors=False):
keys = []
dl = []
response = {}
for k, v in d.iteritems():
for k, v in d.items():
keys.append(k)
dl.append(v)
results = yield defer.DeferredList(dl, consumeErrors=consumeErrors)
@ -176,7 +181,7 @@ def DeferredDict(d, consumeErrors=False):
defer.returnValue(response)
class DeferredProfiler(object):
class DeferredProfiler:
def __init__(self):
self.profile_results = {}

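The obfuscation helpers now route through an explicit rot13() wrapper because Python 3 removed str.encode('rot13'); rot_13 survives only as a str-to-str codec in the codecs module, hence the decode()/encode() hops around base64. A round-trip check of the ported helpers exactly as they appear above:

import base64
import codecs

def rot13(some_str):
    return codecs.encode(some_str, 'rot_13')  # str in, str out

def obfuscate(plain):
    # bytes -> base64 bytes -> str -> rot13 str
    return rot13(base64.b64encode(plain).decode())

def deobfuscate(obfuscated):
    # rot13 str back -> base64 str -> original bytes
    return base64.b64decode(rot13(obfuscated))

assert deobfuscate(obfuscate(b"secret")) == b"secret"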

@ -16,21 +16,21 @@ backend = default_backend()
class CryptBlobInfo(BlobInfo):
def __init__(self, blob_hash, blob_num, length, iv):
BlobInfo.__init__(self, blob_hash, blob_num, length)
super().__init__(blob_hash, blob_num, length)
self.iv = iv
def get_dict(self):
info = {
"blob_num": self.blob_num,
"length": self.length,
"iv": self.iv
"iv": self.iv.decode()
}
if self.blob_hash:
info['blob_hash'] = self.blob_hash
return info
class StreamBlobDecryptor(object):
class StreamBlobDecryptor:
def __init__(self, blob, key, iv, length):
"""
This class decrypts blobs
@ -68,14 +68,14 @@ class StreamBlobDecryptor(object):
def write_bytes():
if self.len_read < self.length:
num_bytes_to_decrypt = greatest_multiple(len(self.buff), (AES.block_size / 8))
num_bytes_to_decrypt = greatest_multiple(len(self.buff), (AES.block_size // 8))
data_to_decrypt, self.buff = split(self.buff, num_bytes_to_decrypt)
write_func(self.cipher.update(data_to_decrypt))
def finish_decrypt():
bytes_left = len(self.buff) % (AES.block_size / 8)
bytes_left = len(self.buff) % (AES.block_size // 8)
if bytes_left != 0:
log.warning(self.buff[-1 * (AES.block_size / 8):].encode('hex'))
log.warning(self.buff[-1 * (AES.block_size // 8):].encode('hex'))
raise Exception("blob %s has incorrect padding: %i bytes left" %
(self.blob.blob_hash, bytes_left))
data_to_decrypt, self.buff = self.buff, b''
@ -99,7 +99,7 @@ class StreamBlobDecryptor(object):
return d
class CryptStreamBlobMaker(object):
class CryptStreamBlobMaker:
def __init__(self, key, iv, blob_num, blob):
"""
This class encrypts data and writes it to a new blob
@ -146,7 +146,7 @@ class CryptStreamBlobMaker(object):
def close(self):
log.debug("closing blob %s with plaintext len %s", str(self.blob_num), str(self.length))
if self.length != 0:
self.length += (AES.block_size / 8) - (self.length % (AES.block_size / 8))
self.length += (AES.block_size // 8) - (self.length % (AES.block_size // 8))
padded_data = self.padder.finalize()
encrypted_data = self.cipher.update(padded_data) + self.cipher.finalize()
self.blob.write(encrypted_data)

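The / to // changes in CryptBlob matter because AES.block_size is given in bits and Python 3's / always returns a float, which breaks byte counts, slice bounds, and os.urandom() arguments. A sketch of the block arithmetic, including the close()-time padding formula from the hunk above:

from cryptography.hazmat.primitives.ciphers.algorithms import AES

# AES.block_size is 128 bits; the byte count must stay an int.
block_bytes = AES.block_size // 8  # 16
assert isinstance(block_bytes, int)

def padded_length(length):
    # Mirrors CryptStreamBlobMaker.close(): round a non-zero plaintext
    # length up to the next whole AES block.
    return length + block_bytes - (length % block_bytes)

assert padded_length(20) == 32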

@ -5,15 +5,14 @@ import os
import logging
from cryptography.hazmat.primitives.ciphers.algorithms import AES
from twisted.internet import interfaces, defer
from zope.interface import implements
from twisted.internet import defer
from lbrynet.cryptstream.CryptBlob import CryptStreamBlobMaker
log = logging.getLogger(__name__)
class CryptStreamCreator(object):
class CryptStreamCreator:
"""
Create a new stream with blobs encrypted by a symmetric cipher.
@ -22,7 +21,7 @@ class CryptStreamCreator(object):
the blob is associated with the stream.
"""
implements(interfaces.IConsumer)
#implements(interfaces.IConsumer)
def __init__(self, blob_manager, name=None, key=None, iv_generator=None):
"""@param blob_manager: Object that stores and provides access to blobs.
@ -101,13 +100,13 @@ class CryptStreamCreator(object):
@staticmethod
def random_iv_generator():
while 1:
yield os.urandom(AES.block_size / 8)
yield os.urandom(AES.block_size // 8)
def setup(self):
"""Create the symmetric key if it wasn't provided"""
if self.key is None:
self.key = os.urandom(AES.block_size / 8)
self.key = os.urandom(AES.block_size // 8)
return defer.succeed(True)
@ -122,7 +121,7 @@ class CryptStreamCreator(object):
yield defer.DeferredList(self.finished_deferreds)
self.blob_count += 1
iv = self.iv_generator.next()
iv = next(self.iv_generator)
final_blob = self._get_blob_maker(iv, self.blob_manager.get_blob_creator())
stream_terminator = yield final_blob.close()
terminator_info = yield self._blob_finished(stream_terminator)
@ -133,7 +132,7 @@ class CryptStreamCreator(object):
if self.current_blob is None:
self.next_blob_creator = self.blob_manager.get_blob_creator()
self.blob_count += 1
iv = self.iv_generator.next()
iv = next(self.iv_generator)
self.current_blob = self._get_blob_maker(iv, self.next_blob_creator)
done, num_bytes_written = self.current_blob.write(data)
data = data[num_bytes_written:]

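Generators lost their .next() method in Python 3; the builtin next() works on both versions, which is the whole change here. The random_iv_generator() shown above then yields 16-byte IVs:

import os
from cryptography.hazmat.primitives.ciphers.algorithms import AES

def random_iv_generator():
    while True:
        yield os.urandom(AES.block_size // 8)

ivs = random_iv_generator()
iv = next(ivs)  # Python 3 spelling; ivs.next() raises AttributeError
assert len(iv) == 16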

@ -1,12 +1,10 @@
import binascii
from zope.interface import implements
from twisted.internet import defer
from lbrynet.cryptstream.CryptBlob import StreamBlobDecryptor
from lbrynet.interfaces import IBlobHandler
class CryptBlobHandler(object):
implements(IBlobHandler)
class CryptBlobHandler:
#implements(IBlobHandler)
def __init__(self, key, write_func):
self.key = key


@ -1,7 +1,5 @@
import binascii
from binascii import unhexlify
import logging
from zope.interface import implements
from lbrynet.interfaces import IStreamDownloader
from lbrynet.core.client.BlobRequester import BlobRequester
from lbrynet.core.client.ConnectionManager import ConnectionManager
from lbrynet.core.client.DownloadManager import DownloadManager
@ -34,9 +32,9 @@ class CurrentlyStartingError(Exception):
pass
class CryptStreamDownloader(object):
class CryptStreamDownloader:
implements(IStreamDownloader)
#implements(IStreamDownloader)
def __init__(self, peer_finder, rate_limiter, blob_manager, payment_rate_manager, wallet,
key, stream_name):
@ -62,8 +60,8 @@ class CryptStreamDownloader(object):
self.blob_manager = blob_manager
self.payment_rate_manager = payment_rate_manager
self.wallet = wallet
self.key = binascii.unhexlify(key)
self.stream_name = binascii.unhexlify(stream_name)
self.key = unhexlify(key)
self.stream_name = unhexlify(stream_name).decode()
self.completed = False
self.stopped = True
self.stopping = False


@ -1,7 +1,7 @@
import logging
from twisted.internet import defer
from twisted._threads import AlreadyQuit
from ComponentManager import ComponentManager
from .ComponentManager import ComponentManager
log = logging.getLogger(__name__)
@ -14,7 +14,7 @@ class ComponentType(type):
return klass
class Component(object):
class Component(metaclass=ComponentType):
"""
lbrynet-daemon component helper
@ -22,7 +22,6 @@ class Component(object):
methods
"""
__metaclass__ = ComponentType
depends_on = []
component_name = None

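Python 3 silently ignores a __metaclass__ attribute in the class body; the metaclass must be passed as a keyword in the class header, which is what the Component and RequiredCondition hunks do. A minimal sketch of the pattern (the registry body is illustrative, not the project's actual metaclass logic):

class ComponentType(type):
    # Hypothetical registry: collect every subclass that declares a name.
    registry = {}

    def __new__(mcs, name, bases, namespace):
        klass = super().__new__(mcs, name, bases, namespace)
        if namespace.get("component_name"):
            mcs.registry[namespace["component_name"]] = klass
        return klass

# Python 3 spelling: metaclass goes in the class header.
class Component(metaclass=ComponentType):
    component_name = None

class DatabaseComponent(Component):
    component_name = "database"

assert ComponentType.registry["database"] is DatabaseComponent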

@ -6,7 +6,7 @@ from lbrynet.core.Error import ComponentStartConditionNotMet
log = logging.getLogger(__name__)
class RegisteredConditions(object):
class RegisteredConditions:
conditions = {}
@ -20,7 +20,7 @@ class RequiredConditionType(type):
return klass
class RequiredCondition(object):
class RequiredCondition(metaclass=RequiredConditionType):
name = ""
component = ""
message = ""
@ -29,10 +29,8 @@ class RequiredCondition(object):
def evaluate(component):
raise NotImplementedError()
__metaclass__ = RequiredConditionType
class ComponentManager(object):
class ComponentManager:
default_component_classes = {}
def __init__(self, reactor=None, analytics_manager=None, skip_components=None, **override_components):
@ -43,7 +41,7 @@ class ComponentManager(object):
self.components = set()
self.analytics_manager = analytics_manager
for component_name, component_class in self.default_component_classes.iteritems():
for component_name, component_class in self.default_component_classes.items():
if component_name in override_components:
component_class = override_components.pop(component_name)
if component_name not in self.skip_components:
@ -52,7 +50,7 @@ class ComponentManager(object):
if override_components:
raise SyntaxError("unexpected components: %s" % override_components)
for component_class in self.component_classes.itervalues():
for component_class in self.component_classes.values():
self.components.add(component_class(self))
@defer.inlineCallbacks
@ -117,7 +115,7 @@ class ComponentManager(object):
:return: (defer.Deferred)
"""
for component_name, cb in callbacks.iteritems():
for component_name, cb in callbacks.items():
if component_name not in self.component_classes:
raise NameError("unknown component: %s" % component_name)
if not callable(cb):
@ -132,7 +130,7 @@ class ComponentManager(object):
stages = self.sort_components()
for stage in stages:
yield defer.DeferredList([_setup(component) for component in stage])
yield defer.DeferredList([_setup(component) for component in stage if not component.running])
@defer.inlineCallbacks
def stop(self):


@ -1,20 +1,21 @@
import os
import logging
from hashlib import sha256
import treq
import math
import binascii
from hashlib import sha256
from types import SimpleNamespace
from twisted.internet import defer, threads, reactor, error
import lbryschema
from txupnp.upnp import UPnP
from lbryum.simple_config import SimpleConfig
from lbryum.constants import HEADERS_URL, HEADER_SIZE
from lbrynet import conf
from lbrynet.core.utils import DeferredDict
from lbrynet.core.PaymentRateManager import OnlyFreePaymentsManager
from lbrynet.core.RateLimiter import RateLimiter
from lbrynet.core.BlobManager import DiskBlobManager
from lbrynet.core.StreamDescriptor import StreamDescriptorIdentifier, EncryptedFileStreamType
from lbrynet.core.Wallet import LBRYumWallet
from lbrynet.wallet.manager import LbryWalletManager
from lbrynet.wallet.network import Network
from lbrynet.core.server.BlobRequestHandler import BlobRequestHandlerFactory
from lbrynet.core.server.ServerProtocol import ServerProtocolFactory
from lbrynet.daemon.Component import Component
@ -25,7 +26,7 @@ from lbrynet.file_manager.EncryptedFileManager import EncryptedFileManager
from lbrynet.lbry_file.client.EncryptedFileDownloader import EncryptedFileSaverFactory
from lbrynet.lbry_file.client.EncryptedFileOptions import add_lbry_file_to_sd_identifier
from lbrynet.reflector import ServerFactory as reflector_server_factory
from lbrynet.txlbryum.factory import StratumClient
from lbrynet.core.utils import generate_id
log = logging.getLogger(__name__)
@ -68,7 +69,7 @@ def get_wallet_config():
return config
class ConfigSettings(object):
class ConfigSettings:
@staticmethod
def get_conf_setting(setting_name):
return conf.settings[setting_name]
@ -101,7 +102,7 @@ class DatabaseComponent(Component):
component_name = DATABASE_COMPONENT
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.storage = None
@property
@ -169,12 +170,18 @@ class DatabaseComponent(Component):
self.storage = None
HEADERS_URL = "https://headers.lbry.io/blockchain_headers_latest"
HEADER_SIZE = 112
class HeadersComponent(Component):
component_name = HEADERS_COMPONENT
def __init__(self, component_manager):
Component.__init__(self, component_manager)
self.config = SimpleConfig(get_wallet_config())
super().__init__(component_manager)
self.headers_dir = os.path.join(conf.settings['lbryum_wallet_dir'], 'lbc_mainnet')
self.headers_file = os.path.join(self.headers_dir, 'headers')
self.old_file = os.path.join(conf.settings['lbryum_wallet_dir'], 'blockchain_headers')
self._downloading_headers = None
self._headers_progress_percent = None
@ -190,19 +197,18 @@ class HeadersComponent(Component):
@defer.inlineCallbacks
def fetch_headers_from_s3(self):
local_header_size = self.local_header_file_size()
self._headers_progress_percent = 0.0
resume_header = {"Range": "bytes={}-".format(local_header_size)}
response = yield treq.get(HEADERS_URL, headers=resume_header)
final_size_after_download = response.length + local_header_size
def collector(data, h_file, start_size):
def collector(data, h_file):
h_file.write(data)
local_size = float(h_file.tell())
final_size = float(final_size_after_download)
self._headers_progress_percent = math.ceil((local_size - start_size) / (final_size - start_size) * 100)
self._headers_progress_percent = math.ceil(local_size / final_size * 100)
if response.code == 406: # our file is bigger
local_header_size = self.local_header_file_size()
resume_header = {"Range": "bytes={}-".format(local_header_size)}
response = yield treq.get(HEADERS_URL, headers=resume_header)
got_406 = response.code == 406 # our file is bigger
final_size_after_download = response.length + local_header_size
if got_406:
log.warning("s3 is more out of date than we are")
# should have something to download and a final length divisible by the header size
elif final_size_after_download and not final_size_after_download % HEADER_SIZE:
@ -211,11 +217,11 @@ class HeadersComponent(Component):
if s3_height > local_height:
if local_header_size:
log.info("Resuming download of %i bytes from s3", response.length)
with open(os.path.join(self.config.path, "blockchain_headers"), "a+b") as headers_file:
yield treq.collect(response, lambda d: collector(d, headers_file, local_header_size))
with open(self.headers_file, "a+b") as headers_file:
yield treq.collect(response, lambda d: collector(d, headers_file))
else:
with open(os.path.join(self.config.path, "blockchain_headers"), "wb") as headers_file:
yield treq.collect(response, lambda d: collector(d, headers_file, 0))
with open(self.headers_file, "wb") as headers_file:
yield treq.collect(response, lambda d: collector(d, headers_file))
log.info("fetched headers from s3 (s3 height: %i), now verifying integrity after download.", s3_height)
self._check_header_file_integrity()
else:
@ -227,20 +233,22 @@ class HeadersComponent(Component):
return max((self.local_header_file_size() / HEADER_SIZE) - 1, 0)
def local_header_file_size(self):
headers_path = os.path.join(self.config.path, "blockchain_headers")
if os.path.isfile(headers_path):
return os.stat(headers_path).st_size
if os.path.isfile(self.headers_file):
return os.stat(self.headers_file).st_size
return 0
@defer.inlineCallbacks
def get_remote_height(self, server, port):
connected = defer.Deferred()
connected.addTimeout(3, reactor, lambda *_: None)
client = StratumClient(connected)
reactor.connectTCP(server, port, client)
yield connected
remote_height = yield client.blockchain_block_get_server_height()
client.client.transport.loseConnection()
def get_remote_height(self):
ledger = SimpleNamespace()
ledger.config = {
'default_servers': conf.settings['lbryum_servers'],
'data_path': conf.settings['lbryum_wallet_dir']
}
net = Network(ledger)
net.start()
yield net.on_connected.first
remote_height = yield net.get_server_height()
yield net.stop()
defer.returnValue(remote_height)
@defer.inlineCallbacks
@ -252,15 +260,10 @@ class HeadersComponent(Component):
if not s3_headers_depth:
defer.returnValue(False)
local_height = self.local_header_file_height()
for server_url in self.config.get('default_servers'):
port = int(self.config.get('default_servers')[server_url]['t'])
try:
remote_height = yield self.get_remote_height(server_url, port)
log.info("%s:%i height: %i, local height: %s", server_url, port, remote_height, local_height)
if remote_height > (local_height + s3_headers_depth):
defer.returnValue(True)
except Exception as err:
log.warning("error requesting remote height from %s:%i - %s", server_url, port, err)
remote_height = yield self.get_remote_height()
log.info("remote height: %i, local height: %s", remote_height, local_height)
if remote_height > (local_height + s3_headers_depth):
defer.returnValue(True)
defer.returnValue(False)
def _check_header_file_integrity(self):
@ -272,22 +275,26 @@ class HeadersComponent(Component):
checksum_length_in_bytes = checksum_height * HEADER_SIZE
if self.local_header_file_size() < checksum_length_in_bytes:
return
headers_path = os.path.join(self.config.path, "blockchain_headers")
with open(headers_path, "rb") as headers_file:
with open(self.headers_file, "rb") as headers_file:
hashsum.update(headers_file.read(checksum_length_in_bytes))
current_checksum = hashsum.hexdigest()
if current_checksum != checksum:
msg = "Expected checksum {}, got {}".format(checksum, current_checksum)
log.warning("Wallet file corrupted, checksum mismatch. " + msg)
log.warning("Deleting header file so it can be downloaded again.")
os.unlink(headers_path)
os.unlink(self.headers_file)
elif (self.local_header_file_size() % HEADER_SIZE) != 0:
log.warning("Header file is good up to checkpoint height, but incomplete. Truncating to checkpoint.")
with open(headers_path, "rb+") as headers_file:
with open(self.headers_file, "rb+") as headers_file:
headers_file.truncate(checksum_length_in_bytes)
@defer.inlineCallbacks
def start(self):
if not os.path.exists(self.headers_dir):
os.mkdir(self.headers_dir)
if os.path.exists(self.old_file):
log.warning("Moving old headers from %s to %s.", self.old_file, self.headers_file)
os.rename(self.old_file, self.headers_file)
self._downloading_headers = yield self.should_download_headers_from_s3()
if self._downloading_headers:
try:
@ -306,7 +313,7 @@ class WalletComponent(Component):
depends_on = [DATABASE_COMPONENT, HEADERS_COMPONENT]
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.wallet = None
@property
@ -329,9 +336,11 @@ class WalletComponent(Component):
@defer.inlineCallbacks
def start(self):
log.info("Starting torba wallet")
storage = self.component_manager.get_component(DATABASE_COMPONENT)
config = get_wallet_config()
self.wallet = LBRYumWallet(storage, config)
lbryschema.BLOCKCHAIN_NAME = conf.settings['blockchain_name']
self.wallet = LbryWalletManager.from_lbrynet_config(conf.settings, storage)
self.wallet.old_db = storage
yield self.wallet.start()
@defer.inlineCallbacks
@ -345,7 +354,7 @@ class BlobComponent(Component):
depends_on = [DATABASE_COMPONENT, DHT_COMPONENT]
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.blob_manager = None
@property
@ -376,7 +385,7 @@ class DHTComponent(Component):
depends_on = [UPNP_COMPONENT]
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.dht_node = None
self.upnp_component = None
self.external_udp_port = None
@ -426,7 +435,7 @@ class HashAnnouncerComponent(Component):
depends_on = [DHT_COMPONENT, DATABASE_COMPONENT]
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.hash_announcer = None
@property
@ -454,7 +463,7 @@ class RateLimiterComponent(Component):
component_name = RATE_LIMITER_COMPONENT
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.rate_limiter = RateLimiter()
@property
@ -475,7 +484,7 @@ class StreamIdentifierComponent(Component):
depends_on = [DHT_COMPONENT, RATE_LIMITER_COMPONENT, BLOB_COMPONENT, DATABASE_COMPONENT, WALLET_COMPONENT]
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.sd_identifier = StreamDescriptorIdentifier()
@property
@ -509,7 +518,7 @@ class PaymentRateComponent(Component):
component_name = PAYMENT_RATE_COMPONENT
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.payment_rate_manager = OnlyFreePaymentsManager()
@property
@ -529,7 +538,7 @@ class FileManagerComponent(Component):
STREAM_IDENTIFIER_COMPONENT, PAYMENT_RATE_COMPONENT]
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.file_manager = None
@property
@ -569,7 +578,7 @@ class PeerProtocolServerComponent(Component):
PAYMENT_RATE_COMPONENT]
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.lbry_server_port = None
@property
@ -621,7 +630,7 @@ class ReflectorComponent(Component):
depends_on = [DHT_COMPONENT, BLOB_COMPONENT, FILE_MANAGER_COMPONENT]
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self.reflector_server_port = GCS('reflector_port')
self.reflector_server = None
@ -655,7 +664,7 @@ class UPnPComponent(Component):
component_name = UPNP_COMPONENT
def __init__(self, component_manager):
Component.__init__(self, component_manager)
super().__init__(component_manager)
self._int_peer_port = GCS('peer_port')
self._int_dht_node_port = GCS('dht_node_port')
self.use_upnp = GCS('use_upnp')

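The header integrity check above hashes the file prefix up to a hard-coded checkpoint, deletes the file on a checksum mismatch, and truncates a ragged tail back to the checkpoint. A condensed standalone sketch, taking the path, checkpoint height, and expected digest as parameters instead of reading them from conf:

import os
from hashlib import sha256

HEADER_SIZE = 112  # bytes per serialized block header, as defined above

def check_header_file(path, checksum_height, expected_checksum):
    checksum_length = checksum_height * HEADER_SIZE
    if os.path.getsize(path) < checksum_length:
        return  # nothing to verify yet
    with open(path, "rb") as f:
        current = sha256(f.read(checksum_length)).hexdigest()
    if current != expected_checksum:
        os.unlink(path)  # corrupted: force a fresh download
    elif os.path.getsize(path) % HEADER_SIZE != 0:
        # Good up to the checkpoint but incomplete: drop the partial tail.
        with open(path, "rb+") as f:
            f.truncate(checksum_length)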

@ -1,5 +1,3 @@
# coding=utf-8
import binascii
import logging.handlers
import mimetypes
import os
@ -7,12 +5,18 @@ import requests
import urllib
import json
import textwrap
from operator import itemgetter
from binascii import hexlify, unhexlify
from copy import deepcopy
from decimal import Decimal, InvalidOperation
from twisted.web import server
from twisted.internet import defer, reactor
from twisted.internet.task import LoopingCall
from twisted.python.failure import Failure
from typing import Union
from torba.constants import COIN
from lbryschema.claim import ClaimDict
from lbryschema.uri import parse_lbry_uri
@ -41,6 +45,8 @@ from lbrynet.dht.error import TimeoutError
from lbrynet.core.Peer import Peer
from lbrynet.core.SinglePeerDownloader import SinglePeerDownloader
from lbrynet.core.client.StandaloneBlobDownloader import StandaloneBlobDownloader
from lbrynet.wallet.account import Account as LBCAccount
from torba.baseaccount import SingleKey, HierarchicalDeterministic
log = logging.getLogger(__name__)
requires = AuthJSONRPCServer.requires
@ -75,7 +81,7 @@ DIRECTION_DESCENDING = 'desc'
DIRECTIONS = DIRECTION_ASCENDING, DIRECTION_DESCENDING
class IterableContainer(object):
class IterableContainer:
def __iter__(self):
for attr in dir(self):
if not attr.startswith("_"):
@ -88,7 +94,7 @@ class IterableContainer(object):
return False
class Checker(object):
class Checker:
"""The looping calls the daemon runs"""
INTERNET_CONNECTION = 'internet_connection_checker', 300
# CONNECTION_STATUS = 'connection_status_checker'
@ -120,7 +126,7 @@ class NoValidSearch(Exception):
pass
class CheckInternetConnection(object):
class CheckInternetConnection:
def __init__(self, daemon):
self.daemon = daemon
@ -128,7 +134,7 @@ class CheckInternetConnection(object):
self.daemon.connected_to_internet = utils.check_connection()
class AlwaysSend(object):
class AlwaysSend:
def __init__(self, value_generator, *args, **kwargs):
self.value_generator = value_generator
self.args = args
@ -176,7 +182,9 @@ class WalletIsLocked(RequiredCondition):
@staticmethod
def evaluate(component):
return component.check_locked()
d = component.check_locked()
d.addCallback(lambda r: not r)
return d
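WalletIsLocked.evaluate() now returns a Deferred, so the truth value has to be inverted with a chained callback rather than a plain not. A runnable sketch with a stubbed check_locked() (the stub is hypothetical; the real one lives on the wallet component):

from twisted.internet import defer

def check_locked():
    # Stand-in: the wallet reports True while it is locked.
    return defer.succeed(True)

def evaluate():
    # The condition is "unlocked", so negate the deferred's result.
    d = check_locked()
    d.addCallback(lambda r: not r)
    return d

results = []
evaluate().addCallback(results.append)
assert results == [False]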
class Daemon(AuthJSONRPCServer):
@ -230,6 +238,13 @@ class Daemon(AuthJSONRPCServer):
# TODO: delete this
self.streams = {}
@property
def ledger(self):
try:
return self.wallet.default_account.ledger
except AttributeError:
return None
@defer.inlineCallbacks
def setup(self):
log.info("Starting lbrynet-daemon")
@ -239,7 +254,7 @@ class Daemon(AuthJSONRPCServer):
def _stop_streams(self):
"""stop pending GetStream downloads"""
for sd_hash, stream in self.streams.iteritems():
for sd_hash, stream in self.streams.items():
stream.cancel(reason="daemon shutdown")
def _shutdown(self):
@ -269,7 +284,7 @@ class Daemon(AuthJSONRPCServer):
@defer.inlineCallbacks
def _get_stream_analytics_report(self, claim_dict):
sd_hash = claim_dict.source_hash
sd_hash = claim_dict.source_hash.decode()
try:
stream_hash = yield self.storage.get_stream_hash_for_sd_hash(sd_hash)
except Exception:
@ -348,49 +363,39 @@ class Daemon(AuthJSONRPCServer):
log.error('Failed to get %s (%s)', name, err)
if self.streams[sd_hash].downloader and self.streams[sd_hash].code != 'running':
yield self.streams[sd_hash].downloader.stop(err)
result = {'error': err.message}
result = {'error': str(err)}
finally:
del self.streams[sd_hash]
defer.returnValue(result)
@defer.inlineCallbacks
def _publish_stream(self, name, bid, claim_dict, file_path=None, certificate_id=None,
def _publish_stream(self, name, bid, claim_dict, file_path=None, certificate=None,
claim_address=None, change_address=None):
publisher = Publisher(
self.blob_manager, self.payment_rate_manager, self.storage, self.file_manager, self.wallet, certificate_id
self.blob_manager, self.payment_rate_manager, self.storage, self.file_manager, self.wallet, certificate
)
parse_lbry_uri(name)
if not file_path:
stream_hash = yield self.storage.get_stream_hash_for_sd_hash(
claim_dict['stream']['source']['source'])
claim_out = yield publisher.publish_stream(name, bid, claim_dict, stream_hash, claim_address,
change_address)
tx = yield publisher.publish_stream(name, bid, claim_dict, stream_hash, claim_address)
else:
claim_out = yield publisher.create_and_publish_stream(name, bid, claim_dict, file_path,
claim_address, change_address)
tx = yield publisher.create_and_publish_stream(name, bid, claim_dict, file_path, claim_address)
if conf.settings['reflect_uploads']:
d = reupload.reflect_file(publisher.lbry_file)
d.addCallbacks(lambda _: log.info("Reflected new publication to lbry://%s", name),
log.exception)
self.analytics_manager.send_claim_action('publish')
log.info("Success! Published to lbry://%s txid: %s nout: %d", name, claim_out['txid'],
claim_out['nout'])
defer.returnValue(claim_out)
@defer.inlineCallbacks
def _resolve_name(self, name, force_refresh=False):
"""Resolves a name. Checks the cache first before going out to the blockchain.
Args:
name: the lbry://<name> to resolve
force_refresh: if True, always go out to the blockchain to resolve.
"""
parsed = parse_lbry_uri(name)
resolution = yield self.wallet.resolve(parsed.name, check_cache=not force_refresh)
if parsed.name in resolution:
result = resolution[parsed.name]
defer.returnValue(result)
nout = 0
txo = tx.outputs[nout]
log.info("Success! Published to lbry://%s txid: %s nout: %d", name, tx.id, nout)
defer.returnValue({
"success": True,
"tx": tx,
"claim_id": txo.claim_id,
"claim_address": self.ledger.hash160_to_address(txo.script.values['pubkey_hash']),
"output": tx.outputs[nout]
})
def _get_or_download_sd_blob(self, blob, sd_hash):
if blob:
@ -482,7 +487,7 @@ class Daemon(AuthJSONRPCServer):
Resolve a name and return the estimated stream cost
"""
resolved = yield self.wallet.resolve(uri)
resolved = (yield self.wallet.resolve(uri))[uri]
if resolved:
claim_response = resolved[uri]
else:
@ -510,7 +515,7 @@ class Daemon(AuthJSONRPCServer):
@defer.inlineCallbacks
def _get_lbry_file_dict(self, lbry_file, full_status=False):
key = binascii.b2a_hex(lbry_file.key) if lbry_file.key else None
key = hexlify(lbry_file.key) if lbry_file.key else None
full_path = os.path.join(lbry_file.download_directory, lbry_file.file_name)
mime_type = mimetypes.guess_type(full_path)[0]
if os.path.isfile(full_path):
@ -772,7 +777,6 @@ class Daemon(AuthJSONRPCServer):
log.info("Get version info: " + json.dumps(platform_info))
return self._render_response(platform_info)
# @AuthJSONRPCServer.deprecated() # deprecated actually disables the call
def jsonrpc_report_bug(self, message=None):
"""
Report a bug to slack
@ -883,12 +887,12 @@ class Daemon(AuthJSONRPCServer):
'auto_renew_claim_height_delta': int
}
for key, setting_type in setting_types.iteritems():
for key, setting_type in setting_types.items():
if key in new_settings:
if isinstance(new_settings[key], setting_type):
conf.settings.update({key: new_settings[key]},
data_types=(conf.TYPE_RUNTIME, conf.TYPE_PERSISTED))
elif setting_type is dict and isinstance(new_settings[key], (unicode, str)):
elif setting_type is dict and isinstance(new_settings[key], str):
decoded = json.loads(str(new_settings[key]))
conf.settings.update({key: decoded},
data_types=(conf.TYPE_RUNTIME, conf.TYPE_PERSISTED))
@ -948,6 +952,7 @@ class Daemon(AuthJSONRPCServer):
return self._render_response(sorted([command for command in self.callable_methods.keys()]))
@requires(WALLET_COMPONENT)
@defer.inlineCallbacks
def jsonrpc_wallet_balance(self, address=None, include_unconfirmed=False):
"""
Return the balance of the wallet
@ -963,11 +968,12 @@ class Daemon(AuthJSONRPCServer):
Returns:
(float) amount of lbry credits in wallet
"""
if address is None:
return self._render_response(float(self.wallet.get_balance()))
else:
return self._render_response(float(
self.wallet.get_address_balance(address, include_unconfirmed)))
if address is not None:
raise NotImplementedError("Limiting by address needs to be re-implemented in new wallet.")
dewies = yield self.wallet.default_account.get_balance(
0 if include_unconfirmed else 6
)
defer.returnValue(round(dewies / COIN, 3))
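The new wallet accounts for balances in integer dewies, so the JSON-RPC layer converts to float LBC at the boundary. A sketch of that conversion, assuming torba's COIN constant of 10**8 dewies per LBC:

COIN = 100_000_000  # dewies per LBC, as imported from torba.constants

def dewies_to_lbc(dewies):
    # Matches the rounding the hunk applies before returning.
    return round(dewies / COIN, 3)

assert dewies_to_lbc(150_000_000) == 1.5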
@requires(WALLET_COMPONENT)
@defer.inlineCallbacks
@ -997,7 +1003,6 @@ class Daemon(AuthJSONRPCServer):
defer.returnValue(response)
@requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
@defer.inlineCallbacks
def jsonrpc_wallet_decrypt(self):
"""
Decrypt an encrypted wallet; this will remove the wallet password
@ -1011,13 +1016,9 @@ class Daemon(AuthJSONRPCServer):
Returns:
(bool) true if wallet is decrypted, otherwise false
"""
result = self.wallet.decrypt_wallet()
response = yield self._render_response(result)
defer.returnValue(response)
return defer.succeed(self.wallet.decrypt_wallet())
@requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
@defer.inlineCallbacks
def jsonrpc_wallet_encrypt(self, new_password):
"""
Encrypt a wallet with a password; if the wallet is already encrypted this will update
@ -1032,12 +1033,10 @@ class Daemon(AuthJSONRPCServer):
Returns:
(bool) true if wallet is encrypted, otherwise false
"""
self.wallet.encrypt_wallet(new_password)
response = yield self._render_response(self.wallet.wallet.use_encryption)
defer.returnValue(response)
return defer.succeed(self.wallet.encrypt_wallet(new_password))
@defer.inlineCallbacks
@AuthJSONRPCServer.deprecated("stop")
def jsonrpc_daemon_stop(self):
"""
Stop lbrynet-daemon
@ -1051,11 +1050,24 @@ class Daemon(AuthJSONRPCServer):
Returns:
(string) Shutdown message
"""
return self.jsonrpc_stop()
def jsonrpc_stop(self):
"""
Stop lbrynet
Usage:
stop
Options:
None
Returns:
(string) Shutdown message
"""
log.info("Shutting down lbrynet daemon")
response = yield self._render_response("Shutting down")
reactor.callLater(0.1, reactor.fireSystemEvent, "shutdown")
defer.returnValue(response)
defer.returnValue("Shutting down")
@requires(FILE_MANAGER_COMPONENT)
@defer.inlineCallbacks
@ -1148,7 +1160,10 @@ class Daemon(AuthJSONRPCServer):
"""
try:
metadata = yield self._resolve_name(name, force_refresh=force)
name = parse_lbry_uri(name).name
metadata = yield self.wallet.resolve(name, check_cache=not force)
if name in metadata:
metadata = metadata[name]
except UnknownNameError:
log.info('Name %s is not known', name)
defer.returnValue(None)
@ -1361,7 +1376,7 @@ class Daemon(AuthJSONRPCServer):
resolved = resolved['claim']
txid, nout, name = resolved['txid'], resolved['nout'], resolved['name']
claim_dict = ClaimDict.load_dict(resolved['value'])
sd_hash = claim_dict.source_hash
sd_hash = claim_dict.source_hash.decode()
if sd_hash in self.streams:
log.info("Already waiting on lbry://%s to start downloading", name)
@ -1532,7 +1547,6 @@ class Daemon(AuthJSONRPCServer):
'claim_id' : (str) claim ID of the resulting claim
}
"""
try:
parsed = parse_lbry_uri(channel_name)
if not parsed.is_channel:
@ -1541,29 +1555,24 @@ class Daemon(AuthJSONRPCServer):
raise Exception("Invalid channel uri")
except (TypeError, URIParseError):
raise Exception("Invalid channel name")
amount = self.get_dewies_or_error("amount", amount)
if amount <= 0:
raise Exception("Invalid amount")
yield self.wallet.update_balance()
if amount >= self.wallet.get_balance():
balance = yield self.wallet.get_max_usable_balance_for_claim(channel_name)
max_bid_amount = balance - MAX_UPDATE_FEE_ESTIMATE
if balance <= MAX_UPDATE_FEE_ESTIMATE:
raise InsufficientFundsError(
"Insufficient funds, please deposit additional LBC. Minimum additional LBC needed {}"
.format(MAX_UPDATE_FEE_ESTIMATE - balance))
elif amount > max_bid_amount:
raise InsufficientFundsError(
"Please wait for any pending bids to resolve or lower the bid value. "
"Currently the maximum amount you can specify for this channel is {}"
.format(max_bid_amount)
)
result = yield self.wallet.claim_new_channel(channel_name, amount)
tx = yield self.wallet.claim_new_channel(channel_name, amount)
self.wallet.save()
self.analytics_manager.send_new_channel()
log.info("Claimed a new channel! Result: %s", result)
response = yield self._render_response(result)
defer.returnValue(response)
nout = 0
txo = tx.outputs[nout]
log.info("Claimed a new channel! lbry://%s txid: %s nout: %d", channel_name, tx.id, nout)
defer.returnValue({
"success": True,
"tx": tx,
"claim_id": txo.claim_id,
"claim_address": self.ledger.hash160_to_address(txo.script.values['pubkey_hash']),
"output": txo
})
@requires(WALLET_COMPONENT)
@defer.inlineCallbacks
@ -1735,23 +1744,28 @@ class Daemon(AuthJSONRPCServer):
if bid <= 0.0:
raise ValueError("Bid value must be greater than 0.0")
bid = int(bid * COIN)
for address in [claim_address, change_address]:
if address is not None:
# raises an error if the address is invalid
decode_address(address)
yield self.wallet.update_balance()
if bid >= self.wallet.get_balance():
balance = yield self.wallet.get_max_usable_balance_for_claim(name)
max_bid_amount = balance - MAX_UPDATE_FEE_ESTIMATE
if balance <= MAX_UPDATE_FEE_ESTIMATE:
raise InsufficientFundsError(
"Insufficient funds, please deposit additional LBC. Minimum additional LBC needed {}"
.format(MAX_UPDATE_FEE_ESTIMATE - balance))
elif bid > max_bid_amount:
raise InsufficientFundsError(
"Please lower the bid value, the maximum amount you can specify for this claim is {}."
.format(max_bid_amount))
available = yield self.wallet.default_account.get_balance()
if bid >= available:
# TODO: add check for existing claim balance
#balance = yield self.wallet.get_max_usable_balance_for_claim(name)
#max_bid_amount = balance - MAX_UPDATE_FEE_ESTIMATE
#if balance <= MAX_UPDATE_FEE_ESTIMATE:
raise InsufficientFundsError(
"Insufficient funds, please deposit additional LBC. Minimum additional LBC needed {}"
.format(round((bid - available)/COIN + 0.01, 2))
)
# .format(MAX_UPDATE_FEE_ESTIMATE - balance))
#elif bid > max_bid_amount:
# raise InsufficientFundsError(
# "Please lower the bid value, the maximum amount you can specify for this claim is {}."
# .format(max_bid_amount))
metadata = metadata or {}
if fee is not None:
@ -1789,7 +1803,7 @@ class Daemon(AuthJSONRPCServer):
log.warning("Stripping empty fee from published metadata")
del metadata['fee']
elif 'address' not in metadata['fee']:
address = yield self.wallet.get_least_used_address()
address = yield self.wallet.default_account.receiving.get_or_create_usable_address()
metadata['fee']['address'] = address
if 'fee' in metadata and 'version' not in metadata['fee']:
metadata['fee']['version'] = '_0_0_1'
@ -1841,24 +1855,19 @@ class Daemon(AuthJSONRPCServer):
'channel_name': channel_name
})
if channel_id:
certificate_id = channel_id
elif channel_name:
certificate_id = None
my_certificates = yield self.wallet.channel_list()
for certificate in my_certificates:
if channel_name == certificate['name']:
certificate_id = certificate['claim_id']
certificate = None
if channel_name:
certificates = yield self.wallet.get_certificates(channel_name)
for cert in certificates:
if cert.claim_id == channel_id:
certificate = cert
break
if not certificate_id:
if certificate is None:
raise Exception("Cannot publish using channel %s" % channel_name)
else:
certificate_id = None
result = yield self._publish_stream(name, bid, claim_dict, file_path, certificate_id,
result = yield self._publish_stream(name, bid, claim_dict, file_path, certificate,
claim_address, change_address)
response = yield self._render_response(result)
defer.returnValue(response)
defer.returnValue(result)
@requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
@defer.inlineCallbacks
@ -1889,9 +1898,13 @@ class Daemon(AuthJSONRPCServer):
if nout is None and txid is not None:
raise Exception('Must specify nout')
result = yield self.wallet.abandon_claim(claim_id, txid, nout)
tx = yield self.wallet.abandon_claim(claim_id, txid, nout)
self.analytics_manager.send_claim_action('abandon')
defer.returnValue(result)
defer.returnValue({
"success": True,
"tx": tx,
"claim_id": claim_id
})
@requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
@defer.inlineCallbacks
@ -2148,8 +2161,7 @@ class Daemon(AuthJSONRPCServer):
except URIParseError:
results[chan_uri] = {"error": "%s is not a valid uri" % chan_uri}
resolved = yield self.wallet.resolve(*valid_uris, check_cache=False, page=page,
page_size=page_size)
resolved = yield self.wallet.resolve(*valid_uris, page=page, page_size=page_size)
for u in resolved:
if 'error' in resolved[u]:
results[u] = resolved[u]
@ -2345,6 +2357,7 @@ class Daemon(AuthJSONRPCServer):
"""
def _disp(address):
address = str(address)
log.info("Got unused wallet address: " + address)
return defer.succeed(address)
@ -2353,36 +2366,6 @@ class Daemon(AuthJSONRPCServer):
d.addCallback(lambda address: self._render_response(address))
return d
@requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
@AuthJSONRPCServer.deprecated("wallet_send")
@defer.inlineCallbacks
def jsonrpc_send_amount_to_address(self, amount, address):
"""
Queue a payment of credits to an address
Usage:
send_amount_to_address (<amount> | --amount=<amount>) (<address> | --address=<address>)
Options:
--amount=<amount> : (float) amount to send
--address=<address> : (str) address to send credits to
Returns:
(bool) true if payment successfully scheduled
"""
if amount < 0:
raise NegativeFundsError()
elif not amount:
raise NullFundsError()
reserved_points = self.wallet.reserve_points(address, amount)
if reserved_points is None:
raise InsufficientFundsError()
yield self.wallet.send_points_to_address(reserved_points, amount)
self.analytics_manager.send_credits_sent()
defer.returnValue(True)
@requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
@defer.inlineCallbacks
def jsonrpc_wallet_send(self, amount, address=None, claim_id=None):
@ -2402,7 +2385,16 @@ class Daemon(AuthJSONRPCServer):
Returns:
If sending to an address:
(bool) true if payment successfully scheduled
(dict) transaction details if payment was successfully scheduled
{
"hex": (str) raw transaction,
"inputs": (list) inputs(dict) used for the transaction,
"outputs": (list) outputs(dict) for the transaction,
"total_fee": (int) fee in dewies,
"total_input": (int) total of inputs in dewies,
"total_output": (int) total of outputs in dewies(input - fees),
"txid": (str) txid of the transaction,
}
If sending a claim tip:
(dict) Dictionary containing the result of the support
@ -2413,25 +2405,26 @@ class Daemon(AuthJSONRPCServer):
}
"""
amount = self.get_dewies_or_error("amount", amount)
if not amount:
raise NullFundsError
elif amount < 0:
raise NegativeFundsError()
if address and claim_id:
raise Exception("Given both an address and a claim id")
elif not address and not claim_id:
raise Exception("Not given an address or a claim id")
try:
amount = Decimal(str(amount))
except InvalidOperation:
raise TypeError("Amount does not represent a valid decimal.")
if amount < 0:
raise NegativeFundsError()
elif not amount:
raise NullFundsError()
if address:
# raises an error if the address is invalid
decode_address(address)
result = yield self.jsonrpc_send_amount_to_address(amount, address)
reserved_points = self.wallet.reserve_points(address, amount)
if reserved_points is None:
raise InsufficientFundsError()
result = yield self.wallet.send_points_to_address(reserved_points, amount)
self.analytics_manager.send_credits_sent()
else:
validate_claim_id(claim_id)
result = yield self.wallet.tip_claim(claim_id, amount)
@ -2442,7 +2435,7 @@ class Daemon(AuthJSONRPCServer):
@defer.inlineCallbacks
def jsonrpc_wallet_prefill_addresses(self, num_addresses, amount, no_broadcast=False):
"""
Create new addresses, each containing `amount` credits
Create new UTXOs, each containing `amount` credits
Usage:
wallet_prefill_addresses [--no_broadcast]
@ -2457,17 +2450,12 @@ class Daemon(AuthJSONRPCServer):
Returns:
(dict) the resulting transaction
"""
if amount < 0:
raise NegativeFundsError()
elif not amount:
raise NullFundsError()
broadcast = not no_broadcast
tx = yield self.wallet.create_addresses_with_balance(
num_addresses, amount, broadcast=broadcast)
tx['broadcast'] = broadcast
defer.returnValue(tx)
return self.jsonrpc_fund(self.wallet.default_account.name,
self.wallet.default_account.name,
amount=amount,
outputs=num_addresses,
broadcast=broadcast)
@requires(WALLET_COMPONENT)
@defer.inlineCallbacks
@@ -2628,7 +2616,7 @@
if not utils.is_valid_blobhash(blob_hash):
raise Exception("invalid blob hash")
finished_deferred = self.dht_node.iterativeFindValue(binascii.unhexlify(blob_hash))
finished_deferred = self.dht_node.iterativeFindValue(unhexlify(blob_hash))
def trap_timeout(err):
err.trap(defer.TimeoutError)
@@ -2639,7 +2627,7 @@
peers = yield finished_deferred
results = [
{
"node_id": node_id.encode('hex'),
"node_id": hexlify(node_id).decode(),
"host": host,
"port": port
}
@@ -2748,7 +2736,7 @@
"""
if uri or stream_hash or sd_hash:
if uri:
metadata = yield self._resolve_name(uri)
metadata = (yield self.wallet.resolve(uri))[uri]
sd_hash = utils.get_sd_hash(metadata)
stream_hash = yield self.storage.get_stream_hash_for_sd_hash(sd_hash)
elif stream_hash:
@@ -2768,7 +2756,7 @@
if sd_hash in self.blob_manager.blobs:
blobs = [self.blob_manager.blobs[sd_hash]] + blobs
else:
blobs = self.blob_manager.blobs.itervalues()
blobs = self.blob_manager.blobs.values()
if needed:
blobs = [blob for blob in blobs if not blob.get_is_verified()]
@@ -2844,21 +2832,21 @@
contact = None
if node_id and address and port:
contact = self.dht_node.contact_manager.get_contact(node_id.decode('hex'), address, int(port))
contact = self.dht_node.contact_manager.get_contact(unhexlify(node_id), address, int(port))
if not contact:
contact = self.dht_node.contact_manager.make_contact(
node_id.decode('hex'), address, int(port), self.dht_node._protocol
unhexlify(node_id), address, int(port), self.dht_node._protocol
)
if not contact:
try:
contact = yield self.dht_node.findContact(node_id.decode('hex'))
contact = yield self.dht_node.findContact(unhexlify(node_id))
except TimeoutError:
result = {'error': 'timeout finding peer'}
defer.returnValue(result)
if not contact:
defer.returnValue({'error': 'peer not found'})
try:
result = yield contact.ping()
result = (yield contact.ping()).decode()
except TimeoutError:
result = {'error': 'ping timeout'}
defer.returnValue(result)
@@ -2892,51 +2880,34 @@
"node_id": (str) the local dht node id
}
"""
result = {}
data_store = self.dht_node._dataStore._dict
datastore_len = len(data_store)
data_store = self.dht_node._dataStore
hosts = {}
if datastore_len:
for k, v in data_store.iteritems():
for contact, value, lastPublished, originallyPublished, originalPublisherID in v:
if contact in hosts:
blobs = hosts[contact]
else:
blobs = []
blobs.append(k.encode('hex'))
hosts[contact] = blobs
for k, v in data_store.items():
for contact in map(itemgetter(0), v):
hosts.setdefault(contact, []).append(hexlify(k).decode())
contact_set = []
blob_hashes = []
contact_set = set()
blob_hashes = set()
result['buckets'] = {}
for i in range(len(self.dht_node._routingTable._buckets)):
for contact in self.dht_node._routingTable._buckets[i]._contacts:
contacts = result['buckets'].get(i, [])
if contact in hosts:
blobs = hosts[contact]
del hosts[contact]
else:
blobs = []
blobs = list(hosts.pop(contact)) if contact in hosts else []
blob_hashes.update(blobs)
host = {
"address": contact.address,
"port": contact.port,
"node_id": contact.id.encode("hex"),
"node_id": hexlify(contact.id).decode(),
"blobs": blobs,
}
for blob_hash in blobs:
if blob_hash not in blob_hashes:
blob_hashes.append(blob_hash)
contacts.append(host)
result['buckets'][i] = contacts
if contact.id.encode('hex') not in contact_set:
contact_set.append(contact.id.encode("hex"))
result['buckets'].setdefault(i, []).append(host)
contact_set.add(hexlify(contact.id).decode())
result['contacts'] = contact_set
result['blob_hashes'] = blob_hashes
result['node_id'] = self.dht_node.node_id.encode('hex')
result['contacts'] = list(contact_set)
result['blob_hashes'] = list(blob_hashes)
result['node_id'] = hexlify(self.dht_node.node_id).decode()
return self._render_response(result)
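For reference, the refactored status report above serializes to plain lists and hex strings. A rough sketch of the resulting shape (all values here are hypothetical; the keys follow the code above):

    # illustrative only -- every value below is made up
    result = {
        'buckets': {
            0: [{
                'address': '10.0.0.1',     # contact.address
                'port': 4444,              # contact.port
                'node_id': 'deadbeef...',  # hexlify(contact.id).decode()
                'blobs': ['91a1...'],      # blob hashes hosted by this contact
            }],
        },
        'contacts': ['deadbeef...'],       # list(contact_set)
        'blob_hashes': ['91a1...'],        # list(blob_hashes)
        'node_id': 'c0ffee...',            # hexlify(self.dht_node.node_id).decode()
    }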
# the single peer downloader needs wallet access
@@ -3039,7 +3010,7 @@
}
try:
resolved_result = yield self.wallet.resolve(uri)
resolved_result = (yield self.wallet.resolve(uri))[uri]
response['did_resolve'] = True
except UnknownNameError:
response['error'] = "Failed to resolve name"
@@ -3089,29 +3060,245 @@
response['head_blob_availability'].get('is_available')
defer.returnValue(response)
@defer.inlineCallbacks
def jsonrpc_cli_test_command(self, pos_arg, pos_args=[], pos_arg2=None, pos_arg3=None,
a_arg=False, b_arg=False):
#######################
# New Wallet Commands #
#######################
# TODO:
# Delete this after all commands have been migrated
# and refactored.
@requires("wallet")
def jsonrpc_account(self, account_name, create=False, delete=False, single_key=False,
seed=None, private_key=None, public_key=None,
change_gap=None, change_max_uses=None,
receiving_gap=None, receiving_max_uses=None,
rename=None, default=False):
"""
This command is only for testing the CLI argument parsing
Create new account or update some settings on an existing account. If no
creation or modification options are provided but the account exists then
it will just display the unmodified settings for the account.
Usage:
cli_test_command [--a_arg] [--b_arg] (<pos_arg> | --pos_arg=<pos_arg>)
[<pos_args>...] [--pos_arg2=<pos_arg2>]
[--pos_arg3=<pos_arg3>]
account [--create | --delete] (<account_name> | --account_name=<account_name>) [--single_key]
[--seed=<seed> | --private_key=<private_key> | --public_key=<public_key>]
[--change_gap=<change_gap>] [--change_max_uses=<change_max_uses>]
[--receiving_gap=<receiving_gap>] [--receiving_max_uses=<receiving_max_uses>]
[--rename=<rename>] [--default]
Options:
--a_arg : (bool) a arg
--b_arg : (bool) b arg
--pos_arg=<pos_arg> : (int) pos arg
--pos_args=<pos_args> : (int) pos args
--pos_arg2=<pos_arg2> : (int) pos arg 2
--pos_arg3=<pos_arg3> : (int) pos arg 3
--account_name=<account_name> : (str) name of the account to create or update
--create : (bool) create the account
--delete : (bool) delete the account
--single_key : (bool) create single key account, default is multi-key
--seed=<seed> : (str) seed to generate new account from
--private_key=<private_key> : (str) private key for new account
--public_key=<public_key> : (str) public key for new account
--receiving_gap=<receiving_gap> : (int) set the gap for receiving addresses
--receiving_max_uses=<receiving_max_uses> : (int) set the maximum number of times to
use a receiving address
--change_gap=<change_gap> : (int) set the gap for change addresses
--change_max_uses=<change_max_uses> : (int) set the maximum number of times to
use a change address
--rename=<rename> : (str) change name of existing account
--default : (bool) make this account the default
Returns:
pos args
(map) new or updated account details
"""
out = (pos_arg, pos_args, pos_arg2, pos_arg3, a_arg, b_arg)
response = yield self._render_response(out)
defer.returnValue(response)
wallet = self.wallet.default_wallet
if create:
self.error_if_account_exists(account_name)
if single_key:
address_generator = {'name': SingleKey.name}
else:
address_generator = {
'name': HierarchicalDeterministic.name,
'receiving': {
'gap': receiving_gap or 20,
'maximum_uses_per_address': receiving_max_uses or 1},
'change': {
'gap': change_gap or 6,
'maximum_uses_per_address': change_max_uses or 1}
}
ledger = self.wallet.get_or_create_ledger('lbc_mainnet')
if seed or private_key or public_key:
account = LBCAccount.from_dict(ledger, wallet, {
'name': account_name,
'seed': seed,
'private_key': private_key,
'public_key': public_key,
'address_generator': address_generator
})
else:
account = LBCAccount.generate(
ledger, wallet, account_name, address_generator)
wallet.save()
elif delete:
account = self.get_account_or_error('account_name', account_name)
wallet.accounts.remove(account)
wallet.save()
return "Account '{}' deleted.".format(account_name)
else:
change_made = False
account = self.get_account_or_error('account_name', account_name)
if rename is not None:
self.error_if_account_exists(rename)
account.name = rename
change_made = True
if account.receiving.name == HierarchicalDeterministic.name:
address_changes = {
'change': {'gap': change_gap, 'maximum_uses_per_address': change_max_uses},
'receiving': {'gap': receiving_gap, 'maximum_uses_per_address': receiving_max_uses},
}
for chain_name in address_changes:
chain = getattr(account, chain_name)
for attr, value in address_changes[chain_name].items():
if value is not None:
setattr(chain, attr, value)
change_made = True
if change_made:
wallet.save()
if default:
wallet.accounts.remove(account)
wallet.accounts.insert(0, account)
wallet.save()
result = account.to_dict()
result.pop('certificates', None)
result['is_default'] = wallet.accounts[0] == account
return result
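The address_generator dictionaries built above come in two shapes. A minimal sketch of both, assuming SingleKey and HierarchicalDeterministic come from torba's baseaccount module (the import path is an assumption; the gap numbers are just the fallback defaults used above):

    from torba.baseaccount import SingleKey, HierarchicalDeterministic  # assumed location

    single_key_generator = {'name': SingleKey.name}  # one address, reused indefinitely

    hd_generator = {
        'name': HierarchicalDeterministic.name,
        'receiving': {'gap': 20, 'maximum_uses_per_address': 1},
        'change': {'gap': 6, 'maximum_uses_per_address': 1},
    }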
@requires("wallet")
def jsonrpc_balance(self, account_name=None, confirmations=6, include_reserved=False,
include_claims=False):
"""
Return the balance of an individual account or all of the accounts.
Usage:
balance [<account_name>] [--confirmations=<confirmations>]
[--include_reserved] [--include_claims]
Options:
--account=<account_name> : (str) If provided only the balance for this
account will be given
--confirmations=<confirmations> : (int) required confirmations (default: 6)
--include_reserved : (bool) include reserved UTXOs (default: false)
--include_claims : (bool) include claims, requires that an
LBC account is specified (default: false)
Returns:
(map) balance of account(s)
"""
if account_name:
for account in self.wallet.accounts:
if account.name == account_name:
if include_claims and not isinstance(account, LBCAccount):
raise Exception(
"'--include-claims' requires specifying an LBC ledger account. "
"Found '{}', but it's an {} ledger account."
.format(account_name, account.ledger.symbol)
)
args = {
'confirmations': confirmations,
'include_reserved': include_reserved
}
if include_claims:
args['include_claims'] = True
return account.get_balance(**args)
raise Exception("Couldn't find an account named: '{}'.".format(account_name))
else:
if include_claims:
raise Exception("'--include-claims' requires specifying an LBC account.")
return self.wallet.get_balances(confirmations)
@requires("wallet")
def jsonrpc_max_address_gap(self, account_name):
"""
Finds ranges of consecutive unused addresses and returns the length of the
longest such range, for both the change and receiving address chains. This
is useful for choosing ideal values for the 'receiving_gap' and 'change_gap'
account settings.
Usage:
max_address_gap (<account_name> | --account=<account_name>)
Options:
--account=<account_name> : (str) account for which to get max gaps
Returns:
(map) maximum gap for change and receiving addresses
"""
return self.get_account_or_error('account', account_name).get_max_gap()
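A worked sketch of the gap this docstring describes (illustrative only; the real computation lives in the account's get_max_gap()):

    # addresses in generation order; True marks an address that has been used
    used = [True, False, False, True, False, False, False, True]

    max_gap = run = 0
    for was_used in used:
        run = 0 if was_used else run + 1
        max_gap = max(max_gap, run)

    assert max_gap == 3  # a 'receiving_gap' of at least 3 would cover this chain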
@requires("wallet")
def jsonrpc_fund(self, to_account, from_account, amount=0,
everything=False, outputs=1, broadcast=False):
"""
Transfer some amount (or --everything) to an account from another
account (can be the same account). Amounts are interpreted as LBC.
You can also spread the transfer across a number of --outputs (cannot
be used together with --everything).
Usage:
fund (<to_account> | --to_account=<to_account>)
(<from_account> | --from_account=<from_account>)
(<amount> | --amount=<amount> | --everything)
[<outputs> | --outputs=<outputs>]
[--broadcast]
Options:
--to_account=<to_account> : (str) send to this account
--from_account=<from_account> : (str) spend from this account
--amount=<amount> : (str) the amount of lbc to transfer
--everything : (bool) transfer everything (excluding claims), default: false.
--outputs=<outputs> : (int) split payment across many outputs, default: 1.
--broadcast : (bool) actually broadcast the transaction, default: false.
Returns:
(map) the resulting transaction
"""
to_account = self.get_account_or_error('to_account', to_account)
from_account = self.get_account_or_error('from_account', from_account)
amount = self.get_dewies_or_error('amount', amount) if amount else None
if not isinstance(outputs, int):
raise ValueError("--outputs must be an integer.")
if everything and outputs > 1:
raise ValueError("Using --everything along with --outputs is not supported.")
return from_account.fund(
to_account=to_account, amount=amount, everything=everything,
outputs=outputs, broadcast=broadcast
)
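A usage sketch (the account names and the daemon handle are hypothetical; the amount is an LBC string, per the options above):

    # move 1.5 LBC into 10 outputs and actually broadcast the transaction
    tx = yield daemon.jsonrpc_fund(
        to_account='savings', from_account='default',
        amount='1.5', outputs=10, broadcast=True,
    )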
def get_account_or_error(self, argument: str, account_name: str, lbc_only=False):
for account in self.wallet.default_wallet.accounts:
if account.name == account_name:
if lbc_only and not isinstance(account, LBCAccount):
raise ValueError(
"Found '{}', but it's an {} ledger account. "
"'{}' requires specifying an LBC ledger account."
.format(account_name, account.ledger.symbol, argument)
)
return account
raise ValueError("Couldn't find an account named: '{}'.".format(account_name))
def error_if_account_exists(self, account_name: str):
for account in self.wallet.default_wallet.accounts:
if account.name == account_name:
raise ValueError("Account with name '{}' already exists.".format(account_name))
@staticmethod
def get_dewies_or_error(argument: str, amount: Union[str, int]):
if isinstance(amount, str):
if '.' in amount:
return int(Decimal(amount) * COIN)
elif amount.isdigit():
amount = int(amount)
if isinstance(amount, int):
return amount * COIN
raise ValueError("Invalid value for '{}' argument: {}".format(argument, amount))
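Since COIN is 10**8 dewies per LBC (imported from torba.constants later in this diff), a few worked conversions through the helper above:

    assert Daemon.get_dewies_or_error('amount', '1.5') == 150000000  # int(Decimal('1.5') * COIN)
    assert Daemon.get_dewies_or_error('amount', '3') == 300000000    # '3'.isdigit() -> 3 * COIN
    assert Daemon.get_dewies_or_error('amount', 2) == 200000000      # already an int -> 2 * COIN
    # a float (or any other type) falls through to the ValueError at the end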
def loggly_time_string(dt):
@@ -3170,7 +3357,7 @@ def create_key_getter(field):
try:
value = value[key]
except KeyError as e:
errmsg = 'Failed to get "{}", key "{}" was not found.'
raise Exception(errmsg.format(field, e.message))
errmsg = "Failed to get '{}', key {} was not found."
raise Exception(errmsg.format(field, str(e)))
return value
return key_getter

View file

@@ -1,224 +0,0 @@
import json
import os
import sys
import colorama
from docopt import docopt
from collections import OrderedDict
from lbrynet import conf
from lbrynet.core import utils
from lbrynet.daemon.auth.client import JSONRPCException, LBRYAPIClient, AuthAPIClient
from lbrynet.daemon.Daemon import Daemon
from lbrynet.core.system_info import get_platform
from jsonrpc.common import RPCError
from requests.exceptions import ConnectionError
from urllib2 import URLError, HTTPError
from httplib import UNAUTHORIZED
def remove_brackets(key):
if key.startswith("<") and key.endswith(">"):
return str(key[1:-1])
return key
def set_kwargs(parsed_args):
kwargs = OrderedDict()
for key, arg in parsed_args.iteritems():
if arg is None:
continue
elif key.startswith("--") and remove_brackets(key[2:]) not in kwargs:
k = remove_brackets(key[2:])
elif remove_brackets(key) not in kwargs:
k = remove_brackets(key)
kwargs[k] = guess_type(arg, k)
return kwargs
def main():
argv = sys.argv[1:]
# check if a config file has been specified. If so, shift
# all the arguments so that the parsing can continue without
# noticing
if len(argv) and argv[0] == "--conf":
if len(argv) < 2:
print_error("No config file specified for --conf option")
print_help()
return
conf.conf_file = argv[1]
argv = argv[2:]
if len(argv):
method, args = argv[0], argv[1:]
else:
print_help()
return
if method in ['help', '--help', '-h']:
if len(args) == 1:
print_help_for_command(args[0])
else:
print_help()
return
elif method in ['version', '--version']:
print utils.json_dumps_pretty(get_platform(get_ip=False))
return
if method not in Daemon.callable_methods:
if method not in Daemon.deprecated_methods:
print_error("\"%s\" is not a valid command." % method)
return
new_method = Daemon.deprecated_methods[method]._new_command
print_error("\"%s\" is deprecated, using \"%s\"." % (method, new_method))
method = new_method
fn = Daemon.callable_methods[method]
parsed = docopt(fn.__doc__, args)
kwargs = set_kwargs(parsed)
colorama.init()
conf.initialize_settings()
try:
api = LBRYAPIClient.get_client()
api.status()
except (URLError, ConnectionError) as err:
if isinstance(err, HTTPError) and err.code == UNAUTHORIZED:
api = AuthAPIClient.config()
# this can happen if the daemon is using auth with the --http-auth flag
# when the config setting is to not use it
try:
api.status()
except:
print_error("Daemon requires authentication, but none was provided.",
suggest_help=False)
return 1
else:
print_error("Could not connect to daemon. Are you sure it's running?",
suggest_help=False)
return 1
# TODO: check if port is bound. Error if its not
try:
result = api.call(method, kwargs)
if isinstance(result, basestring):
# printing the undumped string is prettier
print result
else:
print utils.json_dumps_pretty(result)
except (RPCError, KeyError, JSONRPCException, HTTPError) as err:
if isinstance(err, HTTPError):
error_body = err.read()
try:
error_data = json.loads(error_body)
except ValueError:
print (
"There was an error, and the response was not valid JSON.\n" +
"Raw JSONRPC response:\n" + error_body
)
return 1
print_error(error_data['error']['message'] + "\n", suggest_help=False)
if 'data' in error_data['error'] and 'traceback' in error_data['error']['data']:
print "Here's the traceback for the error you encountered:"
print "\n".join(error_data['error']['data']['traceback'])
print_help_for_command(method)
elif isinstance(err, RPCError):
print_error(err.msg, suggest_help=False)
# print_help_for_command(method)
else:
print_error("Something went wrong\n", suggest_help=False)
print str(err)
return 1
def guess_type(x, key=None):
if not isinstance(x, (unicode, str)):
return x
if key in ('uri', 'channel_name', 'name', 'file_name', 'download_directory'):
return x
if x in ('true', 'True', 'TRUE'):
return True
if x in ('false', 'False', 'FALSE'):
return False
if '.' in x:
try:
return float(x)
except ValueError:
# not a float
pass
try:
return int(x)
except ValueError:
return x
def print_help_suggestion():
print "See `{} help` for more information.".format(os.path.basename(sys.argv[0]))
def print_error(message, suggest_help=True):
error_style = colorama.Style.BRIGHT + colorama.Fore.RED
print error_style + "ERROR: " + message + colorama.Style.RESET_ALL
if suggest_help:
print_help_suggestion()
def print_help():
print "\n".join([
"NAME",
" lbrynet-cli - LBRY command line client.",
"",
"USAGE",
" lbrynet-cli [--conf <config file>] <command> [<args>]",
"",
"EXAMPLES",
" lbrynet-cli commands # list available commands",
" lbrynet-cli status # get daemon status",
" lbrynet-cli --conf ~/l1.conf status # like above but using ~/l1.conf as config file",
" lbrynet-cli resolve_name what # resolve a name",
" lbrynet-cli help resolve_name # get help for a command",
])
def print_help_for_command(command):
fn = Daemon.callable_methods.get(command)
if fn:
print "Help for %s method:\n%s" % (command, fn.__doc__)
def wrap_list_to_term_width(l, width=None, separator=', ', prefix=''):
if width is None:
try:
_, width = os.popen('stty size', 'r').read().split()
width = int(width)
except:
pass
if not width:
width = 80
lines = []
curr_line = ''
for item in l:
new_line = curr_line + item + separator
if len(new_line) + len(prefix) > width:
lines.append(curr_line)
curr_line = item + separator
else:
curr_line = new_line
lines.append(curr_line)
ret = prefix + ("\n" + prefix).join(lines)
if ret.endswith(separator):
ret = ret[:-len(separator)]
return ret
if __name__ == '__main__':
sys.exit(main())

View file

@@ -1,11 +1,11 @@
# -*- coding: utf-8 -*-
import sys
import code
import argparse
import asyncio
import logging.handlers
from exceptions import SystemExit
from twisted.internet import defer, reactor, threads
from aiohttp import client_exceptions
from lbrynet import analytics
from lbrynet import conf
from lbrynet.core import utils
@@ -13,8 +13,6 @@ from lbrynet.core import log_support
from lbrynet.daemon.auth.client import LBRYAPIClient
from lbrynet.daemon.Daemon import Daemon
get_client = LBRYAPIClient.get_client
log = logging.getLogger(__name__)
@@ -117,12 +115,12 @@ def get_methods(daemon):
locs = {}
def wrapped(name, fn):
client = get_client()
client = LBRYAPIClient.get_client()
_fn = getattr(client, name)
_fn.__doc__ = fn.__doc__
return {name: _fn}
for method_name, method in daemon.callable_methods.iteritems():
for method_name, method in daemon.callable_methods.items():
locs.update(wrapped(method_name, method))
return locs
@@ -133,14 +131,14 @@ def run_terminal(callable_methods, started_daemon, quiet=False):
def help(method_name=None):
if not method_name:
print "Available api functions: "
print("Available api functions: ")
for name in callable_methods:
print "\t%s" % name
print("\t%s" % name)
return
if method_name not in callable_methods:
print "\"%s\" is not a recognized api function"
print("\"%s\" is not a recognized api function")
return
print callable_methods[method_name].__doc__
print(callable_methods[method_name].__doc__)
return
locs.update({'help': help})
@@ -148,7 +146,7 @@
if started_daemon:
def exit(status=None):
if not quiet:
print "Stopping lbrynet-daemon..."
print("Stopping lbrynet-daemon...")
callable_methods['daemon_stop']()
return sys.exit(status)
@@ -158,7 +156,7 @@
try:
reactor.callLater(0, reactor.stop)
except Exception as err:
print "error stopping reactor: ", err
print("error stopping reactor: {}".format(err))
return sys.exit(status)
locs.update({'exit': exit})
@@ -184,21 +182,21 @@ def threaded_terminal(started_daemon, quiet):
d.addErrback(log.exception)
def start_lbrynet_console(quiet, use_existing_daemon, useauth):
async def start_lbrynet_console(quiet, use_existing_daemon, useauth):
if not utils.check_connection():
print "Not connected to internet, unable to start"
print("Not connected to internet, unable to start")
raise Exception("Not connected to internet, unable to start")
if not quiet:
print "Starting lbrynet-console..."
print("Starting lbrynet-console...")
try:
get_client().status()
await LBRYAPIClient.get_client().status()
d = defer.succeed(False)
if not quiet:
print "lbrynet-daemon is already running, connecting to it..."
except:
print("lbrynet-daemon is already running, connecting to it...")
except client_exceptions.ClientConnectorError:
if not use_existing_daemon:
if not quiet:
print "Starting lbrynet-daemon..."
print("Starting lbrynet-daemon...")
analytics_manager = analytics.Manager.new_instance()
d = start_server_and_listen(useauth, analytics_manager, quiet)
else:
@@ -225,7 +223,8 @@ def main():
"--http-auth", dest="useauth", action="store_true", default=conf.settings['use_auth_http']
)
args = parser.parse_args()
start_lbrynet_console(args.quiet, args.use_existing_daemon, args.useauth)
loop = asyncio.get_event_loop()
loop.run_until_complete(start_lbrynet_console(args.quiet, args.use_existing_daemon, args.useauth))
reactor.run()

View file

@@ -13,7 +13,6 @@ import argparse
import logging.handlers
from twisted.internet import reactor
from jsonrpc.proxy import JSONRPCProxy
from lbrynet import conf
from lbrynet.core import utils, system_info
@@ -26,20 +25,13 @@ def test_internet_connection():
return utils.check_connection()
def start():
"""The primary entry point for launching the daemon."""
def start(argv=None, conf_path=None):
if conf_path is not None:
conf.conf_file = conf_path
# postpone loading the config file to after the CLI arguments
# have been parsed, as they may contain an alternate config file location
conf.initialize_settings(load_conf_file=False)
conf.initialize_settings()
parser = argparse.ArgumentParser(description="Launch lbrynet-daemon")
parser.add_argument(
"--conf",
help="specify an alternative configuration file",
type=str,
default=None
)
parser = argparse.ArgumentParser()
parser.add_argument(
"--http-auth", dest="useauth", action="store_true", default=conf.settings['use_auth_http']
)
@@ -57,15 +49,14 @@
help='Show daemon version and quit'
)
args = parser.parse_args()
update_settings_from_args(args)
conf.settings.load_conf_file_settings()
args = parser.parse_args(argv)
if args.useauth:
conf.settings.update({'use_auth_http': args.useauth}, data_types=(conf.TYPE_CLI,))
if args.version:
version = system_info.get_platform(get_ip=False)
version['installation_id'] = conf.settings.installation_id
print utils.json_dumps_pretty(version)
print(utils.json_dumps_pretty(version))
return
lbrynet_log = conf.settings.get_log_filename()
@@ -73,14 +64,6 @@ def start():
log_support.configure_loggly_handler()
log.debug('Final Settings: %s', conf.settings.get_current_settings_dict())
try:
log.debug('Checking for an existing lbrynet daemon instance')
JSONRPCProxy.from_url(conf.settings.get_api_connection_string()).status()
log.info("lbrynet-daemon is already running")
return
except Exception:
log.debug('No lbrynet instance found, continuing to start')
log.info("Starting lbrynet-daemon from command line")
if test_internet_connection():
@@ -89,17 +72,3 @@
reactor.run()
else:
log.info("Not connected to internet, unable to start")
def update_settings_from_args(args):
if args.conf:
conf.conf_file = args.conf
if args.useauth:
conf.settings.update({
'use_auth_http': args.useauth,
}, data_types=(conf.TYPE_CLI,))
if __name__ == "__main__":
start()

View file

@@ -29,7 +29,7 @@ STREAM_STAGES = [
log = logging.getLogger(__name__)
class GetStream(object):
class GetStream:
def __init__(self, sd_identifier, wallet, exchange_rate_manager, blob_manager, peer_finder, rate_limiter,
payment_rate_manager, storage, max_key_fee, disable_max_key_fee, data_rate=None, timeout=None):
@@ -162,7 +162,7 @@ class GetStream(object):
@defer.inlineCallbacks
def _initialize(self, stream_info):
# Set sd_hash and return key_fee from stream_info
self.sd_hash = stream_info.source_hash
self.sd_hash = stream_info.source_hash.decode()
key_fee = None
if stream_info.has_fee:
key_fee = yield self.check_fee_and_convert(stream_info.source_fee)

View file

@@ -15,7 +15,7 @@ BITTREX_FEE = 0.0025
COINBASE_FEE = 0.0 # add fee
class ExchangeRate(object):
class ExchangeRate:
def __init__(self, market, spot, ts):
if not int(time.time()) - ts < 600:
raise ValueError('The timestamp is too dated.')
@@ -34,7 +34,7 @@ class ExchangeRate(object):
return {'spot': self.spot, 'ts': self.ts}
class MarketFeed(object):
class MarketFeed:
REQUESTS_TIMEOUT = 20
EXCHANGE_RATE_UPDATE_RATE_SEC = 300
@@ -96,8 +96,7 @@ class MarketFeed(object):
class BittrexFeed(MarketFeed):
def __init__(self):
MarketFeed.__init__(
self,
super().__init__(
"BTCLBC",
"Bittrex",
"https://bittrex.com/api/v1.1/public/getmarkethistory",
@@ -122,8 +121,7 @@ class BittrexFeed(MarketFeed):
class LBRYioFeed(MarketFeed):
def __init__(self):
MarketFeed.__init__(
self,
super().__init__(
"BTCLBC",
"lbry.io",
"https://api.lbry.io/lbc/exchange_rate",
@@ -140,8 +138,7 @@ class LBRYioFeed(MarketFeed):
class LBRYioBTCFeed(MarketFeed):
def __init__(self):
MarketFeed.__init__(
self,
super().__init__(
"USDBTC",
"lbry.io",
"https://api.lbry.io/lbc/exchange_rate",
@@ -161,8 +158,7 @@ class LBRYioBTCFeed(MarketFeed):
class CryptonatorBTCFeed(MarketFeed):
def __init__(self):
MarketFeed.__init__(
self,
super().__init__(
"USDBTC",
"cryptonator.com",
"https://api.cryptonator.com/api/ticker/usd-btc",
@@ -183,8 +179,7 @@ class CryptonatorBTCFeed(MarketFeed):
class CryptonatorFeed(MarketFeed):
def __init__(self):
MarketFeed.__init__(
self,
super().__init__(
"BTCLBC",
"cryptonator.com",
"https://api.cryptonator.com/api/ticker/btc-lbc",
@@ -203,7 +198,7 @@ class CryptonatorFeed(MarketFeed):
return defer.succeed(float(json_response['ticker']['price']))
class ExchangeRateManager(object):
class ExchangeRateManager:
def __init__(self):
self.market_feeds = [
LBRYioBTCFeed(),

View file

@@ -4,25 +4,23 @@ import os
from twisted.internet import defer
from lbrynet.core import file_utils
from lbrynet.file_manager.EncryptedFileCreator import create_lbry_file
log = logging.getLogger(__name__)
class Publisher(object):
def __init__(self, blob_manager, payment_rate_manager, storage, lbry_file_manager, wallet, certificate_id):
class Publisher:
def __init__(self, blob_manager, payment_rate_manager, storage, lbry_file_manager, wallet, certificate):
self.blob_manager = blob_manager
self.payment_rate_manager = payment_rate_manager
self.storage = storage
self.lbry_file_manager = lbry_file_manager
self.wallet = wallet
self.certificate_id = certificate_id
self.certificate = certificate
self.lbry_file = None
@defer.inlineCallbacks
def create_and_publish_stream(self, name, bid, claim_dict, file_path, claim_address=None,
change_address=None):
def create_and_publish_stream(self, name, bid, claim_dict, file_path, holding_address=None):
"""Create lbry file and make claim"""
log.info('Starting publish for %s', name)
if not os.path.isfile(file_path):
@@ -31,7 +29,7 @@ class Publisher(object):
raise Exception("Cannot publish empty file {}".format(file_path))
file_name = os.path.basename(file_path)
with file_utils.get_read_handle(file_path) as read_handle:
with open(file_path, 'rb') as read_handle:
self.lbry_file = yield create_lbry_file(
self.blob_manager, self.storage, self.payment_rate_manager, self.lbry_file_manager, file_name,
read_handle
@@ -43,11 +41,13 @@ class Publisher(object):
claim_dict['stream']['source']['sourceType'] = 'lbry_sd_hash'
claim_dict['stream']['source']['contentType'] = get_content_type(file_path)
claim_dict['stream']['source']['version'] = "_0_0_1" # need current version here
claim_out = yield self.make_claim(name, bid, claim_dict, claim_address, change_address)
tx = yield self.wallet.claim_name(
name, bid, claim_dict, self.certificate, holding_address
)
# check if we have a file already for this claim (if this is a publish update with a new stream)
old_stream_hashes = yield self.storage.get_old_stream_hashes_for_claim_id(
claim_out['claim_id'], self.lbry_file.stream_hash
tx.outputs[0].claim_id, self.lbry_file.stream_hash
)
if old_stream_hashes:
for lbry_file in filter(lambda l: l.stream_hash in old_stream_hashes,
@@ -56,28 +56,22 @@ class Publisher(object):
log.info("Removed old stream for claim update: %s", lbry_file.stream_hash)
yield self.storage.save_content_claim(
self.lbry_file.stream_hash, "%s:%i" % (claim_out['txid'], claim_out['nout'])
self.lbry_file.stream_hash, tx.outputs[0].id
)
defer.returnValue(claim_out)
defer.returnValue(tx)
@defer.inlineCallbacks
def publish_stream(self, name, bid, claim_dict, stream_hash, claim_address=None, change_address=None):
def publish_stream(self, name, bid, claim_dict, stream_hash, holding_address=None):
"""Make a claim without creating a lbry file"""
claim_out = yield self.make_claim(name, bid, claim_dict, claim_address, change_address)
tx = yield self.wallet.claim_name(
name, bid, claim_dict, self.certificate, holding_address
)
if stream_hash: # the stream_hash returned from the db will be None if this isn't a stream we have
yield self.storage.save_content_claim(
stream_hash, "%s:%i" % (claim_out['txid'], claim_out['nout'])
stream_hash.decode(), tx.outputs[0].id
)
self.lbry_file = [f for f in self.lbry_file_manager.lbry_files if f.stream_hash == stream_hash][0]
defer.returnValue(claim_out)
@defer.inlineCallbacks
def make_claim(self, name, bid, claim_dict, claim_address=None, change_address=None):
claim_out = yield self.wallet.claim_name(name, bid, claim_dict,
certificate_id=self.certificate_id,
claim_address=claim_address,
change_address=change_address)
defer.returnValue(claim_out)
defer.returnValue(tx)
def get_content_type(filename):

View file

@@ -1,4 +1 @@
from lbrynet import custom_logger
import Components # register Component classes
from lbrynet.daemon.auth.client import LBRYAPIClient
get_client = LBRYAPIClient.get_client
from . import Components # register Component classes

View file

@@ -9,7 +9,7 @@ log = logging.getLogger(__name__)
@implementer(portal.IRealm)
class HttpPasswordRealm(object):
class HttpPasswordRealm:
def __init__(self, resource):
self.resource = resource
@@ -21,7 +21,7 @@ class HttpPasswordRealm(object):
@implementer(checkers.ICredentialsChecker)
class PasswordChecker(object):
class PasswordChecker:
credentialInterfaces = (credentials.IUsernamePassword,)
def __init__(self, passwords):
@@ -39,8 +39,12 @@ class PasswordChecker(object):
return cls(passwords)
def requestAvatarId(self, creds):
if creds.username in self.passwords:
pw = self.passwords.get(creds.username)
password_dict_bytes = {}
for api in self.passwords:
password_dict_bytes.update({api.encode(): self.passwords[api].encode()})
if creds.username in password_dict_bytes:
pw = password_dict_bytes.get(creds.username)
pw_match = creds.checkPassword(pw)
if pw_match:
return defer.succeed(creds.username)
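The re-keying above appears to exist because twisted hands requestAvatarId a bytes username under Python 3, while the keys loaded from settings are str. A minimal illustration of the mismatch it avoids (values made up):

    passwords = {'api': 'hunter2'}    # str keys, as loaded from settings
    username = b'api'                 # what the credentials checker receives
    assert username not in passwords  # str and bytes never compare equal in py3
    assert username in {k.encode(): v.encode() for k, v in passwords.items()}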

View file

@@ -1,10 +1,9 @@
import os
import json
import urlparse
import requests
from requests.cookies import RequestsCookieJar
import aiohttp
import logging
from jsonrpc.proxy import JSONRPCProxy
from urllib.parse import urlparse
from lbrynet import conf
from lbrynet.daemon.auth.util import load_api_keys, APIKey, API_KEY_NAME, get_auth_message
@ -13,28 +12,50 @@ USER_AGENT = "AuthServiceProxy/0.1"
TWISTED_SESSION = "TWISTED_SESSION"
LBRY_SECRET = "LBRY_SECRET"
HTTP_TIMEOUT = 30
def copy_cookies(cookies):
result = RequestsCookieJar()
result.update(cookies)
return result
SCHEME = "http"
class JSONRPCException(Exception):
def __init__(self, rpc_error):
Exception.__init__(self)
super().__init__()
self.error = rpc_error
class AuthAPIClient(object):
def __init__(self, key, timeout, connection, count, cookies, url, login_url):
class UnAuthAPIClient:
def __init__(self, host, port, session):
self.host = host
self.port = port
self.session = session
self.scheme = SCHEME
def __getattr__(self, method):
async def f(*args, **kwargs):
return await self.call(method, [args, kwargs])
return f
@classmethod
async def from_url(cls, url):
url_fragment = urlparse(url)
host = url_fragment.hostname
port = url_fragment.port
session = aiohttp.ClientSession()
return cls(host, port, session)
async def call(self, method, params=None):
message = {'method': method, 'params': params}
async with self.session.get('{}://{}:{}'.format(self.scheme, self.host, self.port), json=message) as resp:
return await resp.json()
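A usage sketch for the new unauthenticated client (the URL is hypothetical; status resolves through __getattr__ into call):

    import asyncio

    async def check():
        client = await UnAuthAPIClient.from_url('http://localhost:5279')
        print(await client.status())
        await client.session.close()

    asyncio.get_event_loop().run_until_complete(check())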
class AuthAPIClient:
def __init__(self, key, session, cookies, url, login_url):
self.session = session
self.__api_key = key
self.__service_url = login_url
self.__id_count = count
self.__login_url = login_url
self.__id_count = 0
self.__url = url
self.__conn = connection
self.__cookies = copy_cookies(cookies)
self.__cookies = cookies
def __getattr__(self, name):
if name.startswith('__') and name.endswith('__'):
@@ -45,9 +66,10 @@ class AuthAPIClient(object):
return f
def call(self, method, params=None):
async def call(self, method, params=None):
params = params or {}
self.__id_count += 1
pre_auth_post_data = {
'version': '2',
'method': method,
@@ -55,85 +77,60 @@
'id': self.__id_count
}
to_auth = get_auth_message(pre_auth_post_data)
pre_auth_post_data.update({'hmac': self.__api_key.get_hmac(to_auth)})
auth_msg = self.__api_key.get_hmac(to_auth).decode()
pre_auth_post_data.update({'hmac': auth_msg})
post_data = json.dumps(pre_auth_post_data)
cookies = copy_cookies(self.__cookies)
req = requests.Request(
method='POST', url=self.__service_url, data=post_data, cookies=cookies,
headers={
'Host': self.__url.hostname,
'User-Agent': USER_AGENT,
'Content-type': 'application/json'
}
)
http_response = self.__conn.send(req.prepare())
if http_response is None:
raise JSONRPCException({
'code': -342, 'message': 'missing HTTP response from server'})
http_response.raise_for_status()
next_secret = http_response.headers.get(LBRY_SECRET, False)
if next_secret:
self.__api_key.secret = next_secret
self.__cookies = copy_cookies(http_response.cookies)
response = http_response.json()
if response.get('error') is not None:
raise JSONRPCException(response['error'])
elif 'result' not in response:
raise JSONRPCException({
'code': -343, 'message': 'missing JSON-RPC result'})
else:
return response['result']
headers = {
'Host': self.__url.hostname,
'User-Agent': USER_AGENT,
'Content-type': 'application/json'
}
async with self.session.post(self.__login_url, data=post_data, headers=headers) as resp:
if resp is None:
raise JSONRPCException({'code': -342, 'message': 'missing HTTP response from server'})
resp.raise_for_status()
next_secret = resp.headers.get(LBRY_SECRET, False)
if next_secret:
self.__api_key.secret = next_secret
return await resp.json()
@classmethod
def config(cls, key_name=None, key=None, pw_path=None, timeout=HTTP_TIMEOUT, connection=None, count=0,
cookies=None, auth=None, url=None, login_url=None):
async def get_client(cls, key_name=None):
api_key_name = key_name or API_KEY_NAME
pw_path = os.path.join(conf.settings['data_dir'], ".api_keys") if not pw_path else pw_path
if not key:
keys = load_api_keys(pw_path)
api_key = keys.get(api_key_name, False)
else:
api_key = APIKey(name=api_key_name, secret=key)
if login_url is None:
service_url = "http://%s:%s@%s:%i/%s" % (api_key_name,
api_key.secret,
conf.settings['api_host'],
conf.settings['api_port'],
conf.settings['API_ADDRESS'])
else:
service_url = login_url
id_count = count
if auth is None and connection is None and cookies is None and url is None:
# This is a new client instance, start an authenticated session
url = urlparse.urlparse(service_url)
conn = requests.Session()
req = requests.Request(method='POST',
url=service_url,
headers={'Host': url.hostname,
'User-Agent': USER_AGENT,
'Content-type': 'application/json'},)
r = req.prepare()
http_response = conn.send(r)
cookies = RequestsCookieJar()
cookies.update(http_response.cookies)
uid = cookies.get(TWISTED_SESSION)
api_key = APIKey.new(seed=uid)
else:
# This is a client that already has a session, use it
conn = connection
if not cookies.get(LBRY_SECRET):
raise Exception("Missing cookie")
secret = cookies.get(LBRY_SECRET)
api_key = APIKey(secret, api_key_name)
return cls(api_key, timeout, conn, id_count, cookies, url, service_url)
pw_path = os.path.join(conf.settings['data_dir'], ".api_keys")
keys = load_api_keys(pw_path)
api_key = keys.get(api_key_name, False)
login_url = "http://{}:{}@{}:{}".format(api_key_name, api_key.secret, conf.settings['api_host'],
conf.settings['api_port'])
url = urlparse(login_url)
headers = {
'Host': url.hostname,
'User-Agent': USER_AGENT,
'Content-type': 'application/json'
}
session = aiohttp.ClientSession()
async with session.post(login_url, headers=headers) as r:
cookies = r.cookies
uid = cookies.get(TWISTED_SESSION).value
api_key = APIKey.new(seed=uid.encode())
return cls(api_key, session, cookies, url, login_url)
class LBRYAPIClient(object):
class LBRYAPIClient:
@staticmethod
def get_client():
def get_client(conf_path=None):
conf.conf_file = conf_path
if not conf.settings:
conf.initialize_settings()
return AuthAPIClient.config() if conf.settings['use_auth_http'] else \
JSONRPCProxy.from_url(conf.settings.get_api_connection_string())
return AuthAPIClient.get_client() if conf.settings['use_auth_http'] else \
UnAuthAPIClient.from_url(conf.settings.get_api_connection_string())

View file

@@ -14,8 +14,8 @@ log = logging.getLogger(__name__)
class AuthJSONRPCResource(resource.Resource):
def __init__(self, protocol):
resource.Resource.__init__(self)
self.putChild("", protocol)
self.putChild(conf.settings['API_ADDRESS'], protocol)
self.putChild(b"", protocol)
self.putChild(conf.settings['API_ADDRESS'].encode(), protocol)
def getChild(self, name, request):
request.setHeader('cache-control', 'no-cache, no-store, must-revalidate')

View file

@@ -1,13 +1,11 @@
import logging
import urlparse
from six.moves.urllib import parse as urlparse
import json
import inspect
import signal
from decimal import Decimal
from functools import wraps
from zope.interface import implements
from twisted.web import server, resource
from twisted.web import server
from twisted.internet import defer
from twisted.python.failure import Failure
from twisted.internet.error import ConnectionDone, ConnectionLost
@@ -20,16 +18,16 @@ from lbrynet.core import utils
from lbrynet.core.Error import ComponentsNotStarted, ComponentStartConditionNotMet
from lbrynet.core.looping_call_manager import LoopingCallManager
from lbrynet.daemon.ComponentManager import ComponentManager
from lbrynet.undecorated import undecorated
from .util import APIKey, get_auth_message
from .client import LBRY_SECRET
from .util import APIKey, get_auth_message, LBRY_SECRET
from .undecorated import undecorated
from .factory import AuthJSONRPCResource
from lbrynet.daemon.json_response_encoder import JSONResponseEncoder
log = logging.getLogger(__name__)
EMPTY_PARAMS = [{}]
class JSONRPCError(object):
class JSONRPCError:
# http://www.jsonrpc.org/specification#error_object
CODE_PARSE_ERROR = -32700 # Invalid JSON. Error while parsing the JSON text.
CODE_INVALID_REQUEST = -32600 # The JSON sent is not a valid Request object.
@@ -59,7 +57,7 @@ class JSONRPCError(object):
}
def __init__(self, message, code=CODE_APPLICATION_ERROR, traceback=None, data=None):
assert isinstance(code, (int, long)), "'code' must be an int"
assert isinstance(code, int), "'code' must be an int"
assert (data is None or isinstance(data, dict)), "'data' must be None or a dict"
self.code = code
if message is None:
@@ -83,13 +81,8 @@ class JSONRPCError(object):
}
@classmethod
def create_from_exception(cls, exception, code=CODE_APPLICATION_ERROR, traceback=None):
return cls(exception.message, code=code, traceback=traceback)
def default_decimal(obj):
if isinstance(obj, Decimal):
return float(obj)
def create_from_exception(cls, message, code=CODE_APPLICATION_ERROR, traceback=None):
return cls(message, code=code, traceback=traceback)
class UnknownAPIMethodError(Exception):
@@ -111,8 +104,7 @@ def jsonrpc_dumps_pretty(obj, **kwargs):
else:
data = {"jsonrpc": "2.0", "result": obj, "id": id_}
return json.dumps(data, cls=jsonrpclib.JSONRPCEncoder, sort_keys=True, indent=2,
separators=(',', ': '), **kwargs) + "\n"
return json.dumps(data, cls=JSONResponseEncoder, sort_keys=True, indent=2, **kwargs) + "\n"
class JSONRPCServerType(type):
@@ -131,20 +123,19 @@ class JSONRPCServerType(type):
return klass
class AuthorizedBase(object):
__metaclass__ = JSONRPCServerType
class AuthorizedBase(metaclass=JSONRPCServerType):
@staticmethod
def deprecated(new_command=None):
def _deprecated_wrapper(f):
f._new_command = new_command
f.new_command = new_command
f._deprecated = True
return f
return _deprecated_wrapper
@staticmethod
def requires(*components, **conditions):
if conditions and ["conditions"] != conditions.keys():
if conditions and ["conditions"] != list(conditions.keys()):
raise SyntaxError("invalid conditions argument")
condition_names = conditions.get("conditions", [])
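For reference, the two decorators above combine like the sketch below (the condition name is hypothetical; compare @requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED]) earlier in this diff):

    class MyDaemon(AuthorizedBase):

        @AuthorizedBase.requires("wallet", conditions=["wallet_is_unlocked"])
        def jsonrpc_guarded(self):
            ...

        @AuthorizedBase.deprecated("new_name")  # dispatcher redirects via new_command
        def jsonrpc_old_name(self):
            ...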
@@ -189,7 +180,7 @@ class AuthJSONRPCServer(AuthorizedBase):
the server will randomize the shared secret and return the new value under the LBRY_SECRET header, which the
client uses to generate the token for their next request.
"""
implements(resource.IResource)
#implements(resource.IResource)
isLeaf = True
allowed_during_startup = []
@@ -205,20 +196,23 @@
skip_components=to_skip or [],
reactor=reactor
)
self.looping_call_manager = LoopingCallManager({n: lc for n, (lc, t) in (looping_calls or {}).iteritems()})
self._looping_call_times = {n: t for n, (lc, t) in (looping_calls or {}).iteritems()}
self.looping_call_manager = LoopingCallManager({n: lc for n, (lc, t) in (looping_calls or {}).items()})
self._looping_call_times = {n: t for n, (lc, t) in (looping_calls or {}).items()}
self._use_authentication = use_authentication or conf.settings['use_auth_http']
self.listening_port = None
self._component_setup_deferred = None
self.announced_startup = False
self.sessions = {}
self.server = None
@defer.inlineCallbacks
def start_listening(self):
from twisted.internet import reactor, error as tx_error
try:
reactor.listenTCP(
conf.settings['api_port'], self.get_server_factory(), interface=conf.settings['api_host']
self.server = self.get_server_factory()
self.listening_port = reactor.listenTCP(
conf.settings['api_port'], self.server, interface=conf.settings['api_host']
)
log.info("lbrynet API listening on TCP %s:%i", conf.settings['api_host'], conf.settings['api_port'])
yield self.setup()
@@ -241,7 +235,7 @@
reactor.addSystemEventTrigger('before', 'shutdown', self._shutdown)
if not self.analytics_manager.is_started:
self.analytics_manager.start()
for lc_name, lc_time in self._looping_call_times.iteritems():
for lc_name, lc_time in self._looping_call_times.items():
self.looping_call_manager.start(lc_name, lc_time)
def update_attribute(setup_result, component):
@@ -259,7 +253,12 @@
# ignore INT/TERM signals once shutdown has started
signal.signal(signal.SIGINT, self._already_shutting_down)
signal.signal(signal.SIGTERM, self._already_shutting_down)
if self.listening_port:
self.listening_port.stopListening()
self.looping_call_manager.shutdown()
if self.server is not None:
for session in list(self.server.sessions.values()):
session.expire()
if self.analytics_manager:
self.analytics_manager.shutdown()
try:
@@ -287,8 +286,8 @@
request.setHeader(LBRY_SECRET, self.sessions.get(session_id).secret)
@staticmethod
def _render_message(request, message):
request.write(message)
def _render_message(request, message: str):
request.write(message.encode())
request.finish()
def _render_error(self, failure, request, id_):
@@ -299,8 +298,15 @@
error = failure.check(JSONRPCError)
if error is None:
# maybe its a twisted Failure with another type of error
error = JSONRPCError(failure.getErrorMessage() or failure.type.__name__,
traceback=failure.getTraceback())
if hasattr(failure.type, "code"):
error_code = failure.type.code
else:
error_code = JSONRPCError.CODE_APPLICATION_ERROR
error = JSONRPCError.create_from_exception(
failure.getErrorMessage() or failure.type.__name__,
code=error_code,
traceback=failure.getTraceback()
)
if not failure.check(ComponentsNotStarted, ComponentStartConditionNotMet):
log.warning("error processing api request: %s\ntraceback: %s", error.message,
"\n".join(error.traceback))
@@ -308,7 +314,7 @@
# last resort, just cast it as a string
error = JSONRPCError(str(failure))
response_content = jsonrpc_dumps_pretty(error, id=id_)
response_content = jsonrpc_dumps_pretty(error, id=id_, ledger=self.ledger)
self._set_headers(request, response_content)
request.setResponseCode(200)
self._render_message(request, response_content)
@@ -324,7 +330,7 @@
return self._render(request)
except BaseException as e:
log.error(e)
error = JSONRPCError.create_from_exception(e, traceback=format_exc())
error = JSONRPCError.create_from_exception(str(e), traceback=format_exc())
self._render_error(error, request, None)
return server.NOT_DONE_YET
@@ -344,7 +350,6 @@
def expire_session():
self._unregister_user_session(session_id)
session.startCheckingExpiration()
session.notifyOnExpire(expire_session)
message = "OK"
request.setResponseCode(200)
@@ -355,12 +360,12 @@
session.touch()
request.content.seek(0, 0)
content = request.content.read()
content = request.content.read().decode()
try:
parsed = jsonrpclib.loads(content)
except ValueError:
except json.JSONDecodeError:
log.warning("Unable to decode request json")
self._render_error(JSONRPCError(None, JSONRPCError.CODE_PARSE_ERROR), request, None)
self._render_error(JSONRPCError(None, code=JSONRPCError.CODE_PARSE_ERROR), request, None)
return server.NOT_DONE_YET
request_id = None
@@ -384,7 +389,8 @@
log.warning("API validation failed")
self._render_error(
JSONRPCError.create_from_exception(
err, code=JSONRPCError.CODE_AUTHENTICATION_ERROR,
str(err),
code=JSONRPCError.CODE_AUTHENTICATION_ERROR,
traceback=format_exc()
),
request, request_id
@@ -399,12 +405,12 @@
except UnknownAPIMethodError as err:
log.warning('Failed to get function %s: %s', function_name, err)
self._render_error(
JSONRPCError(None, JSONRPCError.CODE_METHOD_NOT_FOUND),
JSONRPCError(None, code=JSONRPCError.CODE_METHOD_NOT_FOUND),
request, request_id
)
return server.NOT_DONE_YET
if args == EMPTY_PARAMS or args == []:
if args in (EMPTY_PARAMS, []):
_args, _kwargs = (), {}
elif isinstance(args, dict):
_args, _kwargs = (), args
@@ -510,7 +516,7 @@
def _get_jsonrpc_method(self, function_path):
if function_path in self.deprecated_methods:
new_command = self.deprecated_methods[function_path]._new_command
new_command = self.deprecated_methods[function_path].new_command
log.warning('API function \"%s\" is deprecated, please update to use \"%s\"',
function_path, new_command)
function_path = new_command
@@ -519,7 +525,7 @@
@staticmethod
def _check_params(function, args_tup, args_dict):
argspec = inspect.getargspec(undecorated(function))
argspec = inspect.getfullargspec(undecorated(function))
num_optional_params = 0 if argspec.defaults is None else len(argspec.defaults)
duplicate_params = [
@@ -539,7 +545,7 @@
if len(missing_required_params):
return 'Missing required parameters', missing_required_params
extraneous_params = [] if argspec.keywords is not None else [
extraneous_params = [] if argspec.varkw is not None else [
extra_param
for extra_param in args_dict
if extra_param not in argspec.args[1:]
@@ -568,10 +574,10 @@
def _callback_render(self, result, request, id_, auth_required=False):
try:
encoded_message = jsonrpc_dumps_pretty(result, id=id_, default=default_decimal)
message = jsonrpc_dumps_pretty(result, id=id_, ledger=self.ledger)
request.setResponseCode(200)
self._set_headers(request, encoded_message, auth_required)
self._render_message(request, encoded_message)
self._set_headers(request, message, auth_required)
self._render_message(request, message)
except Exception as err:
log.exception("Failed to render API response: %s", result)
self._render_error(err, request, id_)

View file

@@ -33,11 +33,11 @@ def undecorated(o):
except AttributeError:
pass
# try:
# # python3
# closure = o.__closure__
# except AttributeError:
# return
try:
# python3
closure = o.__closure__
except AttributeError:
return
if closure:
for cell in closure:

View file

@@ -9,21 +9,22 @@ import logging
log = logging.getLogger(__name__)
API_KEY_NAME = "api"
LBRY_SECRET = "LBRY_SECRET"
def sha(x):
def sha(x: bytes) -> bytes:
h = hashlib.sha256(x).digest()
return base58.b58encode(h)
def generate_key(x=None):
def generate_key(x: bytes = None) -> bytes:
if x is None:
return sha(os.urandom(256))
else:
return sha(x)
class APIKey(object):
class APIKey:
def __init__(self, secret, name, expiration=None):
self.secret = secret
self.name = name
@@ -40,7 +41,7 @@ class APIKey(object):
def get_hmac(self, message):
decoded_key = self._raw_key()
signature = hmac.new(decoded_key, message, hashlib.sha256)
signature = hmac.new(decoded_key, message.encode(), hashlib.sha256)
return base58.b58encode(signature.digest())
def compare_hmac(self, message, token):
@@ -65,7 +66,7 @@ def load_api_keys(path):
keys_for_return = {}
for key_name in data:
key = data[key_name]
secret = key['secret']
secret = key['secret'].decode()
expiration = key['expiration']
keys_for_return.update({key_name: APIKey(secret, key_name, expiration)})
return keys_for_return

View file

@@ -0,0 +1,46 @@
from decimal import Decimal
from binascii import hexlify
from datetime import datetime
from json import JSONEncoder
from lbrynet.wallet.transaction import Transaction, Output
class JSONResponseEncoder(JSONEncoder):
def __init__(self, *args, ledger, **kwargs):
super().__init__(*args, **kwargs)
self.ledger = ledger
def default(self, obj): # pylint: disable=method-hidden
if isinstance(obj, Transaction):
return self.encode_transaction(obj)
if isinstance(obj, Output):
return self.encode_output(obj)
if isinstance(obj, datetime):
return obj.strftime("%Y%m%dT%H:%M:%S")
if isinstance(obj, Decimal):
return float(obj)
if isinstance(obj, bytes):
return obj.decode()
return super().default(obj)
def encode_transaction(self, tx):
return {
'txid': tx.id,
'inputs': [self.encode_input(txo) for txo in tx.inputs],
'outputs': [self.encode_output(txo) for txo in tx.outputs],
'total_input': tx.input_sum,
'total_output': tx.input_sum - tx.fee,
'total_fee': tx.fee,
'hex': hexlify(tx.raw).decode(),
}
def encode_output(self, txo):
return {
'nout': txo.position,
'amount': txo.amount,
'address': txo.get_address(self.ledger)
}
def encode_input(self, txi):
return self.encode_output(txi.txo_ref.txo)
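A usage sketch: json.dumps forwards unrecognized keyword arguments to the encoder class, which is how the ledger reaches __init__ above (the tx and ledger objects are assumed to be in scope):

    import json

    payload = {'jsonrpc': '2.0', 'result': tx, 'id': 1}
    print(json.dumps(payload, cls=JSONResponseEncoder, ledger=ledger, indent=2))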

View file

@@ -39,7 +39,7 @@ def migrate_blobs_db(db_dir):
blobs_db_cursor.execute(
"ALTER TABLE blobs ADD COLUMN should_announce integer NOT NULL DEFAULT 0")
else:
log.warn("should_announce already exists somehow, proceeding anyways")
log.warning("should_announce already exists somehow, proceeding anyways")
# if lbryfile_info.db doesn't exist, skip marking blobs as should_announce = True
if not os.path.isfile(lbryfile_info_db):
@@ -83,4 +83,3 @@
blobs_db_file.commit()
blobs_db_file.close()
lbryfile_info_file.close()

View file

@@ -247,7 +247,7 @@ def do_migration(db_dir):
claim_queries = {} # <sd_hash>: claim query tuple
# get the claim queries ready, only keep those with associated files
for outpoint, sd_hash in file_outpoints.iteritems():
for outpoint, sd_hash in file_outpoints.items():
if outpoint in claim_outpoint_queries:
claim_queries[sd_hash] = claim_outpoint_queries[outpoint]
@@ -260,7 +260,7 @@
claim_arg_tup[7], claim_arg_tup[6], claim_arg_tup[8],
smart_decode(claim_arg_tup[8]).certificate_id, claim_arg_tup[5], claim_arg_tup[4]
)
for sd_hash, claim_arg_tup in claim_queries.iteritems() if claim_arg_tup
for sd_hash, claim_arg_tup in claim_queries.items() if claim_arg_tup
] # sd_hash, (txid, nout, claim_id, name, sequence, address, height, amount, serialized)
)
@@ -268,7 +268,7 @@
damaged_stream_sds = []
# import the files and get sd hashes of streams to attempt recovering
for sd_hash, file_query in file_args.iteritems():
for sd_hash, file_query in file_args.items():
failed_sd = _import_file(*file_query)
if failed_sd:
damaged_stream_sds.append(failed_sd)

View file

@@ -2,6 +2,7 @@ import logging
import os
import sqlite3
import traceback
from binascii import hexlify, unhexlify
from decimal import Decimal
from twisted.internet import defer, task, threads
from twisted.enterprise import adbapi
@@ -11,7 +12,8 @@ from lbryschema.decode import smart_decode
from lbrynet import conf
from lbrynet.cryptstream.CryptBlob import CryptBlobInfo
from lbrynet.dht.constants import dataExpireTimeout
from lbryum.constants import COIN
from lbrynet.wallet.database import WalletDatabase
from torba.constants import COIN
log = logging.getLogger(__name__)
@@ -83,18 +85,19 @@
class SqliteConnection(adbapi.ConnectionPool):
def __init__(self, db_path):
adbapi.ConnectionPool.__init__(self, 'sqlite3', db_path, check_same_thread=False)
super().__init__('sqlite3', db_path, check_same_thread=False)
@rerun_if_locked
def runInteraction(self, interaction, *args, **kw):
return adbapi.ConnectionPool.runInteraction(self, interaction, *args, **kw)
return super().runInteraction(interaction, *args, **kw)
@classmethod
def set_reactor(cls, reactor):
cls.reactor = reactor
class SQLiteStorage(object):
class SQLiteStorage:
CREATE_TABLES_QUERY = """
pragma foreign_keys=on;
pragma journal_mode=WAL;
@@ -164,7 +167,7 @@ class SQLiteStorage(object):
timestamp integer,
primary key (sd_hash, reflector_address)
);
"""
""" + WalletDatabase.CREATE_TABLES_QUERY
def __init__(self, db_dir, reactor=None):
if not reactor:
@@ -209,6 +212,12 @@ class SQLiteStorage(object):
else:
defer.returnValue([])
def run_and_return_id(self, query, *args):
def do_save(t):
t.execute(query, args)
return t.lastrowid
return self.db.runInteraction(do_save)
def stop(self):
if self.check_should_announce_lc and self.check_should_announce_lc.running:
self.check_should_announce_lc.stop()
@@ -259,7 +268,7 @@ class SQLiteStorage(object):
blob_hashes = yield self.run_and_return_list(
"select blob_hash from blob where status='finished'"
)
defer.returnValue([blob_hash.decode('hex') for blob_hash in blob_hashes])
defer.returnValue([unhexlify(blob_hash) for blob_hash in blob_hashes])
def count_finished_blobs(self):
return self.run_and_return_one_or_none(
@@ -483,21 +492,17 @@ class SQLiteStorage(object):
@defer.inlineCallbacks
def save_downloaded_file(self, stream_hash, file_name, download_directory, data_payment_rate):
# touch the closest available file to the file name
file_name = yield open_file_for_writing(download_directory.decode('hex'), file_name.decode('hex'))
file_name = yield open_file_for_writing(unhexlify(download_directory).decode(), unhexlify(file_name).decode())
result = yield self.save_published_file(
stream_hash, file_name.encode('hex'), download_directory, data_payment_rate
stream_hash, hexlify(file_name.encode()), download_directory, data_payment_rate
)
defer.returnValue(result)
def save_published_file(self, stream_hash, file_name, download_directory, data_payment_rate, status="stopped"):
def do_save(db_transaction):
db_transaction.execute(
"insert into file values (?, ?, ?, ?, ?)",
(stream_hash, file_name, download_directory, data_payment_rate, status)
)
file_rowid = db_transaction.lastrowid
return file_rowid
return self.db.runInteraction(do_save)
return self.run_and_return_id(
"insert into file values (?, ?, ?, ?, ?)",
stream_hash, file_name, download_directory, data_payment_rate, status
)
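The new run_and_return_id helper factors out the insert-then-lastrowid pattern that save_published_file previously open-coded. A standalone sqlite3 sketch of the same idea (schema simplified for illustration):

import sqlite3

db = sqlite3.connect(":memory:")
db.execute("create table file (stream_hash text, file_name text)")

def run_and_return_id(query, *args):
    cursor = db.execute(query, args)
    return cursor.lastrowid  # rowid assigned to the row just inserted

assert run_and_return_id("insert into file values (?, ?)", "hash", "name") == 1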
def get_filename_for_rowid(self, rowid):
return self.run_and_return_one_or_none("select file_name from file where rowid=?", rowid)
@@ -609,7 +614,7 @@ class SQLiteStorage(object):
source_hash = None
except AttributeError:
source_hash = None
serialized = claim_info.get('hex') or smart_decode(claim_info['value']).serialized.encode('hex')
serialized = claim_info.get('hex') or hexlify(smart_decode(claim_info['value']).serialized)
transaction.execute(
"insert or replace into claim values (?, ?, ?, ?, ?, ?, ?, ?, ?)",
(outpoint, claim_id, name, amount, height, serialized, certificate_id, address, sequence)
@@ -651,6 +656,19 @@ class SQLiteStorage(object):
if support_dl:
yield defer.DeferredList(support_dl)
def save_claims_for_resolve(self, claim_infos):
to_save = []
for info in claim_infos:
if 'value' in info:
if info['value']:
to_save.append(info)
else:
if 'certificate' in info and info['certificate']['value']:
to_save.append(info['certificate'])
if 'claim' in info and info['claim']['value']:
to_save.append(info['claim'])
return self.save_claims(to_save)
def get_old_stream_hashes_for_claim_id(self, claim_id, new_stream_hash):
return self.run_and_return_list(
"select f.stream_hash from file f "
@@ -667,7 +685,7 @@ class SQLiteStorage(object):
).fetchone()
if not claim_info:
raise Exception("claim not found")
new_claim_id, claim = claim_info[0], ClaimDict.deserialize(claim_info[1].decode('hex'))
new_claim_id, claim = claim_info[0], ClaimDict.deserialize(unhexlify(claim_info[1]))
# certificate claims should not be in the content_claim table
if not claim.is_stream:
@@ -680,7 +698,7 @@ class SQLiteStorage(object):
if not known_sd_hash:
raise Exception("stream not found")
# check the claim contains the same sd hash
if known_sd_hash[0] != claim.source_hash:
if known_sd_hash[0].encode() != claim.source_hash:
raise Exception("stream mismatch")
# if there is a current claim associated to the file, check that the new claim is an update to it
@@ -828,7 +846,7 @@ class SQLiteStorage(object):
def save_claim_tx_heights(self, claim_tx_heights):
def _save_claim_heights(transaction):
for outpoint, height in claim_tx_heights.iteritems():
for outpoint, height in claim_tx_heights.items():
transaction.execute(
"update claim set height=? where claim_outpoint=? and height=-1",
(height, outpoint)
@@ -864,7 +882,7 @@ def _format_claim_response(outpoint, claim_id, name, amount, height, serialized,
"claim_id": claim_id,
"address": address,
"claim_sequence": claim_sequence,
"value": ClaimDict.deserialize(serialized.decode('hex')).claim_dict,
"value": ClaimDict.deserialize(unhexlify(serialized)).claim_dict,
"height": height,
"amount": float(Decimal(amount) / Decimal(COIN)),
"nout": int(outpoint.split(":")[1]),

lbrynet/dht/contact.py
View file

@@ -1,16 +1,18 @@
import ipaddress
from binascii import hexlify
from functools import reduce
from lbrynet.dht import constants
def is_valid_ipv4(address):
try:
ip = ipaddress.ip_address(address.decode()) # this needs to be unicode, thus the decode()
ip = ipaddress.ip_address(address)
return ip.version == 4
except ipaddress.AddressValueError:
return False
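In Python 3, ipaddress.ip_address accepts str directly, so the .decode() shim goes away. One caveat, shown here in a hypothetical rewrite rather than the committed code: the ip_address factory raises plain ValueError for unparseable input, and AddressValueError is a subclass of ValueError, so catching ValueError covers both cases:

import ipaddress

def is_valid_ipv4(address):
    try:
        return ipaddress.ip_address(address).version == 4
    except ValueError:  # also covers ipaddress.AddressValueError
        return False

assert is_valid_ipv4("127.0.0.1")
assert not is_valid_ipv4("::1")        # parses, but wrong version
assert not is_valid_ipv4("nonsense")   # raises ValueError, not AddressValueError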
class _Contact(object):
class _Contact:
""" Encapsulation for remote contact
This class contains information on a single remote contact, and also
@@ -19,8 +21,8 @@ class _Contact(object):
def __init__(self, contactManager, id, ipAddress, udpPort, networkProtocol, firstComm):
if id is not None:
if not len(id) == constants.key_bits / 8:
raise ValueError("invalid node id: %s" % id.encode('hex'))
if not len(id) == constants.key_bits // 8:
raise ValueError("invalid node id: {}".format(hexlify(id).decode()))
if not 0 <= udpPort <= 65535:
raise ValueError("invalid port")
if not is_valid_ipv4(ipAddress):
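The / → // change matters because Python 3's / always yields a float; floor division keeps the byte length an int, which is required anywhere the value is used as a length or index. With the 384-bit node ids lbrynet's DHT uses (an assumed sample value):

key_bits = 384                # assumed constants.key_bits (48-byte node ids)
assert key_bits / 8 == 48.0   # true division: float in Python 3
assert key_bits // 8 == 48    # floor division: int
padding = b"\x00" * (key_bits // 8)  # repetition requires an int, not 48.0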
@@ -56,7 +58,7 @@ class _Contact(object):
def log_id(self, short=True):
if not self.id:
return "not initialized"
id_hex = self.id.encode('hex')
id_hex = hexlify(self.id)
return id_hex if not short else id_hex[:8]
@property
@@ -95,25 +97,17 @@ class _Contact(object):
return None
def __eq__(self, other):
if isinstance(other, _Contact):
return self.id == other.id
elif isinstance(other, str):
return self.id == other
else:
return False
if not isinstance(other, _Contact):
raise TypeError("invalid type to compare with Contact: %s" % str(type(other)))
return (self.id, self.address, self.port) == (other.id, other.address, other.port)
def __ne__(self, other):
if isinstance(other, _Contact):
return self.id != other.id
elif isinstance(other, str):
return self.id != other
else:
return True
def __hash__(self):
return hash((self.id, self.address, self.port))
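The explicit __hash__ is not optional here: Python 3 sets __hash__ to None on any class that defines __eq__ without it, which would make contacts unusable as dict keys or set members. A short demonstration:

class Eq:
    def __eq__(self, other):
        return NotImplemented

try:
    hash(Eq())
except TypeError as exc:
    print(exc)  # unhashable type: 'Eq'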
def compact_ip(self):
compact_ip = reduce(
lambda buff, x: buff + bytearray([int(x)]), self.address.split('.'), bytearray())
return str(compact_ip)
return compact_ip
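compact_ip now returns the accumulated bytearray instead of wrapping it in str(), since str() of a bytearray in Python 3 produces a repr like "bytearray(b'...')" rather than raw bytes. The packing itself is the usual 4-byte network form:

from functools import reduce

address = "127.0.0.1"
packed = reduce(lambda buff, x: buff + bytearray([int(x)]), address.split('.'), bytearray())
assert bytes(packed) == b"\x7f\x00\x00\x01"  # socket.inet_aton(address) gives the same bytes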
def set_id(self, id):
if not self._id:
@@ -156,12 +150,12 @@ class _Contact(object):
raise AttributeError("unknown command: %s" % name)
def _sendRPC(*args, **kwargs):
return self._networkProtocol.sendRPC(self, name, args)
return self._networkProtocol.sendRPC(self, name.encode(), args)
return _sendRPC
class ContactManager(object):
class ContactManager:
def __init__(self, get_time=None):
if not get_time:
from twisted.internet import reactor
@@ -171,12 +165,11 @@ class ContactManager(object):
self._rpc_failures = {}
def get_contact(self, id, address, port):
for contact in self._contacts.itervalues():
for contact in self._contacts.values():
if contact.id == id and contact.address == address and contact.port == port:
return contact
def make_contact(self, id, ipAddress, udpPort, networkProtocol, firstComm=0):
ipAddress = str(ipAddress)
contact = self.get_contact(id, ipAddress, udpPort)
if contact:
return contact

lbrynet/dht/datastore.py
View file

@@ -1,27 +1,21 @@
import UserDict
import constants
from interface import IDataStore
from zope.interface import implements
from collections import UserDict
from . import constants
class DictDataStore(UserDict.DictMixin):
class DictDataStore(UserDict):
""" A datastore using an in-memory Python dictionary """
implements(IDataStore)
#implements(IDataStore)
def __init__(self, getTime=None):
# Dictionary format:
# { <key>: (<contact>, <value>, <lastPublished>, <originallyPublished>, <originalPublisherID>) }
self._dict = {}
super().__init__()
if not getTime:
from twisted.internet import reactor
getTime = reactor.seconds
self._getTime = getTime
self.completed_blobs = set()
def keys(self):
""" Return a list of the keys in this data store """
return self._dict.keys()
def filter_bad_and_expired_peers(self, key):
"""
Returns only non-expired and unknown/good peers
@@ -29,41 +23,44 @@ class DictDataStore(UserDict.DictMixin):
return filter(
lambda peer:
self._getTime() - peer[3] < constants.dataExpireTimeout and peer[0].contact_is_good is not False,
self._dict[key]
self[key]
)
def filter_expired_peers(self, key):
"""
Returns only non-expired peers
"""
return filter(lambda peer: self._getTime() - peer[3] < constants.dataExpireTimeout, self._dict[key])
return filter(lambda peer: self._getTime() - peer[3] < constants.dataExpireTimeout, self[key])
def removeExpiredPeers(self):
for key in self._dict.keys():
unexpired_peers = self.filter_expired_peers(key)
expired_keys = []
for key in self.keys():
unexpired_peers = list(self.filter_expired_peers(key))
if not unexpired_peers:
del self._dict[key]
expired_keys.append(key)
else:
self._dict[key] = unexpired_peers
self[key] = unexpired_peers
for key in expired_keys:
del self[key]
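Deleting keys inside the loop worked on Python 2 because keys() returned a list copy; Python 3's live view raises RuntimeError if the dict changes size during iteration, hence the two-pass collect-then-delete above:

peers = {"blob_a": [], "blob_b": ["peer"]}
expired = [k for k, v in peers.items() if not v]  # pass 1: collect
for k in expired:                                 # pass 2: delete safely
    del peers[k]
assert peers == {"blob_b": ["peer"]}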
def hasPeersForBlob(self, key):
return True if key in self._dict and len(self.filter_bad_and_expired_peers(key)) else False
return True if key in self and len(tuple(self.filter_bad_and_expired_peers(key))) else False
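The tuple() around filter_bad_and_expired_peers is what makes this truthiness test meaningful: Python 3's filter returns a lazy iterator object, which is truthy even when it would yield nothing:

lazy = filter(None, [])
assert bool(lazy)        # the iterator object itself is always truthy
assert not tuple(lazy)   # materializing it reveals there are no items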
def addPeerToBlob(self, contact, key, compact_address, lastPublished, originallyPublished, originalPublisherID):
if key in self._dict:
if compact_address not in map(lambda store_tuple: store_tuple[1], self._dict[key]):
self._dict[key].append(
if key in self:
if compact_address not in map(lambda store_tuple: store_tuple[1], self[key]):
self[key].append(
(contact, compact_address, lastPublished, originallyPublished, originalPublisherID)
)
else:
self._dict[key] = [(contact, compact_address, lastPublished, originallyPublished, originalPublisherID)]
self[key] = [(contact, compact_address, lastPublished, originallyPublished, originalPublisherID)]
def getPeersForBlob(self, key):
return [] if key not in self._dict else [val[1] for val in self.filter_bad_and_expired_peers(key)]
return [] if key not in self else [val[1] for val in self.filter_bad_and_expired_peers(key)]
def getStoringContacts(self):
contacts = set()
for key in self._dict:
for values in self._dict[key]:
for key in self:
for values in self[key]:
contacts.add(values[0])
return list(contacts)

lbrynet/dht/distance.py
View file

@@ -1,21 +1,21 @@
from lbrynet.dht import constants
class Distance(object):
class Distance:
"""Calculate the XOR result between two string variables.
Frequently we re-use one of the points so as an optimization
we pre-calculate the long value of that point.
we pre-calculate the value of that point.
"""
def __init__(self, key):
if len(key) != constants.key_bits / 8:
if len(key) != constants.key_bits // 8:
raise ValueError("invalid key length: %i" % len(key))
self.key = key
self.val_key_one = long(key.encode('hex'), 16)
self.val_key_one = int.from_bytes(key, 'big')
def __call__(self, key_two):
val_key_two = long(key_two.encode('hex'), 16)
val_key_two = int.from_bytes(key_two, 'big')
return self.val_key_one ^ val_key_two
def is_closer(self, a, b):

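int.from_bytes(key, 'big') is the Python 3 spelling of long(key.encode('hex'), 16): both read the key as one big-endian integer. A worked XOR distance, using short 4-byte keys for readability:

a = bytes.fromhex("deadbeef")
b = bytes.fromhex("deadbe00")
assert int.from_bytes(a, 'big') == int("deadbeef", 16)              # same integer either way
assert int.from_bytes(a, 'big') ^ int.from_bytes(b, 'big') == 0xEF  # keys differ only in the low byte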
View file

@@ -1,134 +1,75 @@
from error import DecodeError
from .error import DecodeError
class Encoding(object):
""" Interface for RPC message encoders/decoders
All encoding implementations used with this library should inherit and
implement this.
"""
def encode(self, data):
""" Encode the specified data
@param data: The data to encode
This method has to support encoding of the following
types: C{str}, C{int} and C{long}
Any additional data types may be supported as long as the
implementing class's C{decode()} method can successfully
decode them.
@return: The encoded data
@rtype: str
"""
def decode(self, data):
""" Decode the specified data string
@param data: The data (byte string) to decode.
@type data: str
@return: The decoded data (in its correct type)
"""
def bencode(data):
""" Encoder implementation of the Bencode algorithm (Bittorrent). """
if isinstance(data, int):
return b'i%de' % data
elif isinstance(data, (bytes, bytearray)):
return b'%d:%s' % (len(data), data)
elif isinstance(data, str):
return b'%d:%s' % (len(data), data.encode())
elif isinstance(data, (list, tuple)):
encoded_list_items = b''
for item in data:
encoded_list_items += bencode(item)
return b'l%se' % encoded_list_items
elif isinstance(data, dict):
encoded_dict_items = b''
keys = data.keys()
for key in sorted(keys):
encoded_dict_items += bencode(key)
encoded_dict_items += bencode(data[key])
return b'd%se' % encoded_dict_items
else:
raise TypeError("Cannot bencode '%s' object" % type(data))
class Bencode(Encoding):
""" Implementation of a Bencode-based algorithm (Bencode is the encoding
algorithm used by Bittorrent).
def bdecode(data):
""" Decoder implementation of the Bencode algorithm. """
assert type(data) == bytes # fixme: _maybe_ remove this after porting
if len(data) == 0:
raise DecodeError('Cannot decode empty string')
try:
return _decode_recursive(data)[0]
except ValueError as e:
raise DecodeError(str(e))
@note: This algorithm differs from the "official" Bencode algorithm in
that it can encode/decode floating point values in addition to
integers.
"""
def encode(self, data):
""" Encoder implementation of the Bencode algorithm
@param data: The data to encode
@type data: int, long, tuple, list, dict or str
@return: The encoded data
@rtype: str
"""
if isinstance(data, (int, long)):
return 'i%de' % data
elif isinstance(data, str):
return '%d:%s' % (len(data), data)
elif isinstance(data, (list, tuple)):
encodedListItems = ''
for item in data:
encodedListItems += self.encode(item)
return 'l%se' % encodedListItems
elif isinstance(data, dict):
encodedDictItems = ''
keys = data.keys()
keys.sort()
for key in keys:
encodedDictItems += self.encode(key) # TODO: keys should always be bytestrings
encodedDictItems += self.encode(data[key])
return 'd%se' % encodedDictItems
else:
print data
raise TypeError("Cannot bencode '%s' object" % type(data))
def decode(self, data):
""" Decoder implementation of the Bencode algorithm
@param data: The encoded data
@type data: str
@note: This is a convenience wrapper for the recursive decoding
algorithm, C{_decodeRecursive}
@return: The decoded data, as a native Python type
@rtype: int, list, dict or str
"""
if len(data) == 0:
raise DecodeError('Cannot decode empty string')
def _decode_recursive(data, start_index=0):
if data[start_index] == ord('i'):
end_pos = data[start_index:].find(b'e') + start_index
return int(data[start_index + 1:end_pos]), end_pos + 1
elif data[start_index] == ord('l'):
start_index += 1
decoded_list = []
while data[start_index] != ord('e'):
list_data, start_index = _decode_recursive(data, start_index)
decoded_list.append(list_data)
return decoded_list, start_index + 1
elif data[start_index] == ord('d'):
start_index += 1
decoded_dict = {}
while data[start_index] != ord('e'):
key, start_index = _decode_recursive(data, start_index)
value, start_index = _decode_recursive(data, start_index)
decoded_dict[key] = value
return decoded_dict, start_index
elif data[start_index] == ord('f'):
# This (float data type) is a non-standard extension to the original Bencode algorithm
end_pos = data[start_index:].find(b'e') + start_index
return float(data[start_index + 1:end_pos]), end_pos + 1
elif data[start_index] == ord('n'):
# This (None/NULL data type) is a non-standard extension
# to the original Bencode algorithm
return None, start_index + 1
else:
split_pos = data[start_index:].find(b':') + start_index
try:
return self._decodeRecursive(data)[0]
except ValueError as e:
raise DecodeError(e.message)
@staticmethod
def _decodeRecursive(data, startIndex=0):
""" Actual implementation of the recursive Bencode algorithm
Do not call this; use C{decode()} instead
"""
if data[startIndex] == 'i':
endPos = data[startIndex:].find('e') + startIndex
return int(data[startIndex + 1:endPos]), endPos + 1
elif data[startIndex] == 'l':
startIndex += 1
decodedList = []
while data[startIndex] != 'e':
listData, startIndex = Bencode._decodeRecursive(data, startIndex)
decodedList.append(listData)
return decodedList, startIndex + 1
elif data[startIndex] == 'd':
startIndex += 1
decodedDict = {}
while data[startIndex] != 'e':
key, startIndex = Bencode._decodeRecursive(data, startIndex)
value, startIndex = Bencode._decodeRecursive(data, startIndex)
decodedDict[key] = value
return decodedDict, startIndex
elif data[startIndex] == 'f':
# This (float data type) is a non-standard extension to the original Bencode algorithm
endPos = data[startIndex:].find('e') + startIndex
return float(data[startIndex + 1:endPos]), endPos + 1
elif data[startIndex] == 'n':
# This (None/NULL data type) is a non-standard extension
# to the original Bencode algorithm
return None, startIndex + 1
else:
splitPos = data[startIndex:].find(':') + startIndex
try:
length = int(data[startIndex:splitPos])
except ValueError, e:
raise DecodeError, e
startIndex = splitPos + 1
endPos = startIndex + length
bytes = data[startIndex:endPos]
return bytes, endPos
length = int(data[start_index:split_pos])
except ValueError:
raise DecodeError()
start_index = split_pos + 1
end_pos = start_index + length
b = data[start_index:end_pos]
return b, end_pos

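Taken together, the module now exposes plain bencode()/bdecode() functions in place of the old Encoding/Bencode classes. A round trip under the new API (import path assumed; note the decoder returns bytes for strings, so bytes keys round-trip cleanly):

from lbrynet.dht.encoding import bencode, bdecode  # assumed module path

payload = {b'name': b'lbry', b'peers': [b'a', b'b'], b'port': 4444}
encoded = bencode(payload)
assert encoded == b'd4:name4:lbry5:peersl1:a1:be4:porti4444ee'
assert bdecode(encoded) == payload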
Some files were not shown because too many files have changed in this diff