Merge branch 'lbryum-refactor'

This commit is contained in:
Jack Robison 2018-08-24 15:04:35 -04:00
commit b101fafd39
No known key found for this signature in database
GPG key ID: DF25C68FE0239BB2
218 changed files with 5423 additions and 4822 deletions

30
.gitignore vendored

@@ -1,25 +1,9 @@
-*.pyc
-*.egg
-*.so
-*.xml
-*.iml
-*.log
-*.pem
-*.decTest
-*.prof
-.#*
-/build/build
-/build/dist
-/bulid/requirements_base.txt
-/lbrynet.egg-info
-/docs_build
-/lbry-venv
-.idea/
-.coverage
-.DS_Store
+/build
+/dist
+/.tox
+/.idea
+/.coverage
+lbrynet.egg-info
+__pycache__
 # temporary files from the twisted.trial test runner
 _trial_temp/


@@ -121,7 +121,11 @@ disable=
 unidiomatic-typecheck,
 global-at-module-level,
 inconsistent-return-statements,
-keyword-arg-before-vararg
+keyword-arg-before-vararg,
+assignment-from-no-return,
+useless-return,
+assignment-from-none,
+stop-iteration-return

 [REPORTS]
@@ -386,7 +390,7 @@ int-import-graph=
 [DESIGN]
 # Maximum number of arguments for function / method
-max-args=5
+max-args=10
 # Argument names that match this expression will be ignored. Default to name
 # with leading underscore
@@ -405,7 +409,7 @@ max-branches=12
 max-statements=50
 # Maximum number of parents for a class (see R0901).
-max-parents=7
+max-parents=8
 # Maximum number of attributes for a class (see R0902).
 max-attributes=7


@@ -1,42 +1,105 @@
-os: linux
-dist: trusty
+sudo: required
+dist: xenial
 language: python
-python: 2.7
-branches:
-  except:
-    - gh-pages
+python: "3.7"
+jobs:
+  include:
+    - stage: code quality
+      name: "pylint lbrynet"
+      install:
+        - pip install pylint
+        - pip install git+https://github.com/lbryio/torba.git
+        - pip install git+https://github.com/lbryio/lbryschema.git
+        - pip install -e .
+      script: pylint lbrynet
+    - &tests
+      stage: test
+      name: "Unit Tests w/ Python 3.7"
+      install:
+        - pip install coverage
+        - pip install git+https://github.com/lbryio/torba.git
+        - pip install git+https://github.com/lbryio/lbryschema.git
+        - pip install -e .[test]
+      script: HOME=/tmp coverage run --source=lbrynet -m twisted.trial --reactor=asyncio tests.unit
+      after_success:
+        - bash <(curl -s https://codecov.io/bash)
+    - <<: *tests
+      name: "Unit Tests w/ Python 3.6"
+      python: "3.6"
+    - <<: *tests
+      name: "DHT Tests w/ Python 3.7"
+      script: HOME=/tmp coverage run --source=lbrynet -m twisted.trial --reactor=asyncio tests.functional
+    - <<: *tests
+      name: "DHT Tests w/ Python 3.6"
+      python: "3.6"
+      script: HOME=/tmp coverage run --source=lbrynet -m twisted.trial --reactor=asyncio tests.functional
+    - name: "Integration Tests"
+      install:
+        - pip install tox-travis coverage
+        - pushd .. && git clone https://github.com/lbryio/electrumx.git --branch lbryumx && popd
+        - pushd .. && git clone https://github.com/lbryio/orchstr8.git && popd
+        - pushd .. && git clone https://github.com/lbryio/lbryschema.git && popd
+        - pushd .. && git clone https://github.com/lbryio/lbryumx.git && cd lbryumx && git checkout afd34f323dd94c516108a65240f7d17aea8efe85 && cd .. && popd
+        - pushd .. && git clone https://github.com/lbryio/torba.git && popd
+      script: tox
+      after_success:
+        - coverage combine tests/
+        - bash <(curl -s https://codecov.io/bash)
+    - stage: build
+      name: "Windows"
+      language: generic
+      services:
+        - docker
+      install:
+        - docker pull cdrx/pyinstaller-windows:python3-32bit
+      script:
+        - docker run -v "$(pwd):/src/lbry" cdrx/pyinstaller-windows:python3-32bit lbry/scripts/wine_build.sh
+      addons:
+        artifacts:
+          working_dir: dist
+          paths:
+            - lbrynet.exe
+          target_paths:
+            - /daemon/build-${TRAVIS_BUILD_NUMBER}_commit-${TRAVIS_COMMIT:0:7}_branch-${TRAVIS_BRANCH}$([ ! -z ${TRAVIS_TAG} ] && echo _tag-${TRAVIS_TAG})/win/
+    - &build
+      name: "Linux"
+      python: "3.6"
+      install:
+        - pip3 install pyinstaller
+        - pip3 install git+https://github.com/lbryio/torba.git
+        - pip3 install git+https://github.com/lbryio/lbryschema.git
+        - pip3 install -e .
+      script:
+        - pyinstaller -F -n lbrynet lbrynet/cli.py
+        - ./dist/lbrynet --version
+      env: OS=linux
+      addons:
+        artifacts:
+          working_dir: dist
+          paths:
+            - lbrynet
+            # artifact uploader thinks lbrynet is a directory, https://github.com/travis-ci/artifacts/issues/78
+          target_paths:
+            - /daemon/build-${TRAVIS_BUILD_NUMBER}_commit-${TRAVIS_COMMIT:0:7}_branch-${TRAVIS_BRANCH}$([ ! -z ${TRAVIS_TAG} ] && echo _tag-${TRAVIS_TAG})/${OS}/lbrynet
+    - <<: *build
+      name: "Mac"
+      os: osx
+      osx_image: xcode9.4
+      language: generic
+      env: OS=mac
 cache:
   directories:
     - $HOME/.cache/pip
     - $HOME/Library/Caches/pip
-    - $TRAVIS_BUILD_DIR/cache/wheel
-addons:
-  #srcclr:
-  #  debug: true
-  apt:
-    packages:
-      - libgmp3-dev
-      - build-essential
-      - git
-      - libssl-dev
-      - libffi-dev
-before_install:
-  - virtualenv venv
-  - source venv/bin/activate
-install:
-  - pip install -U pip==9.0.3
-  - pip install -r requirements.txt
-  - pip install -r requirements_testing.txt
-  - pip install .
-script:
-  - pip install mock pylint
-  - pylint lbrynet
-  - PYTHONPATH=. trial lbrynet.tests
-  - rvm install ruby-2.3.1
-  - rvm use 2.3.1 && gem install danger --version '~> 4.0' && danger
+    - $TRAVIS_BUILD_DIR/.tox


@@ -9,7 +9,7 @@ at anytime.
 ## [Unreleased]
 ### Security
-  *
+  * Upgraded `cryptography` package.
   *
 ### Fixed
@@ -21,15 +21,18 @@ at anytime.
   *
 ### Changed
-  *
-  *
+  * Ported to Python 3 without backwards compatibility with Python 2.
+  * Switched to a brand new wallet implementation: torba.
+  * Format of wallet has changed to support multiple accounts in one wallet.
 ### Added
-  *
-  *
+  * `fund` command, used to move funds between or within an account in various ways.
+  * `max_address_gap` command, for finding large gaps of unused addresses
+  * `balance` command, a more detailed version of `wallet_balance` which includes all accounts.
+  * `account` command, adding/deleting/modifying accounts including setting the default account.
 ### Removed
-  *
-  *
+  * `send_amount_to_address` command, which was previously marked as deprecated
   *


@@ -1,33 +0,0 @@
$env:Path += ";C:\MinGW\bin\"
$env:Path += ";C:\Program Files (x86)\Windows Kits\10\bin\x86\"
gcc --version
mingw32-make --version
# build/install miniupnpc manually
tar zxf miniupnpc-1.9.tar.gz
cd miniupnpc-1.9
mingw32-make -f Makefile.mingw
python setupmingw32.py build --compiler=mingw32
python setupmingw32.py install
cd ..\
Remove-Item -Recurse -Force miniupnpc-1.9
# copy requirements from lbry, but remove miniupnpc (installed manually)
Get-Content ..\requirements.txt | Select-String -Pattern 'miniupnpc' -NotMatch | Out-File requirements_base.txt
python set_build.py
pip install -r requirements.txt
pip install ..\.
pyinstaller -y daemon.onefile.spec
pyinstaller -y cli.onefile.spec
pyinstaller -y console.onefile.spec
nuget install secure-file -ExcludeVersion
secure-file\tools\secure-file -decrypt .\lbry2.pfx.enc -secret "$env:pfx_key"
signtool.exe sign /f .\lbry2.pfx /p "$env:key_pass" /tr http://tsa.starfieldtech.com /td SHA256 /fd SHA256 dist\*.exe
python zip_daemon.py
python upload_assets.py


@@ -1,59 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
ROOT="$( cd "$( dirname "${BASH_SOURCE[0]}" )/.." && pwd )"
cd "$ROOT"
BUILD_DIR="$ROOT/build"
FULL_BUILD="${FULL_BUILD:-false}"
if [ -n "${TEAMCITY_VERSION:-}" -o -n "${APPVEYOR:-}" ]; then
FULL_BUILD="true"
fi
[ -d "$BUILD_DIR/bulid" ] && rm -rf "$BUILD_DIR/build"
[ -d "$BUILD_DIR/dist" ] && rm -rf "$BUILD_DIR/dist"
if [ "$FULL_BUILD" == "true" ]; then
# install dependencies
$BUILD_DIR/prebuild.sh
VENV="$BUILD_DIR/venv"
if [ -d "$VENV" ]; then
rm -rf "$VENV"
fi
virtualenv "$VENV"
set +u
source "$VENV/bin/activate"
set -u
# must set build before installing lbrynet. otherwise it has no effect
python "$BUILD_DIR/set_build.py"
fi
cp "$ROOT/requirements.txt" "$BUILD_DIR/requirements_base.txt"
(
cd "$BUILD_DIR"
pip install -r requirements.txt
)
(
cd "$BUILD_DIR"
pyinstaller -y daemon.onefile.spec
pyinstaller -y cli.onefile.spec
pyinstaller -y console.onefile.spec
)
python "$BUILD_DIR/zip_daemon.py"
if [ "$FULL_BUILD" == "true" ]; then
# electron-build has a publish feature, but I had a hard time getting
# it to reliably work and it also seemed difficult to configure. Not proud of
# this, but it seemed better to write my own.
python "$BUILD_DIR/upload_assets.py"
deactivate
fi
echo 'Build complete.'


@@ -1,42 +0,0 @@
# -*- mode: python -*-
import platform
import os
dir = 'build';
cwd = os.getcwd()
if os.path.basename(cwd) != dir:
raise Exception('pyinstaller build needs to be run from the ' + dir + ' directory')
repo_base = os.path.abspath(os.path.join(cwd, '..'))
execfile(os.path.join(cwd, "entrypoint.py")) # ghetto import
system = platform.system()
if system == 'Darwin':
icns = os.path.join(repo_base, 'build', 'icon.icns')
elif system == 'Linux':
icns = os.path.join(repo_base, 'build', 'icons', '256x256.png')
elif system == 'Windows':
icns = os.path.join(repo_base, 'build', 'icons', 'lbry256.ico')
else:
print 'Warning: System {} has no icons'.format(system)
icns = None
a = Entrypoint('lbrynet', 'console_scripts', 'lbrynet-cli', pathex=[cwd])
pyz = PYZ(a.pure, a.zipped_data)
exe = EXE(
pyz,
a.scripts,
a.binaries,
a.zipfiles,
a.datas,
name='lbrynet-cli',
debug=False,
strip=False,
upx=True,
console=True,
icon=icns
)


@@ -1,50 +0,0 @@
# -*- mode: python -*-
import platform
import os
import lbryum
dir = 'build';
cwd = os.getcwd()
if os.path.basename(cwd) != dir:
raise Exception('pyinstaller build needs to be run from the ' + dir + ' directory')
repo_base = os.path.abspath(os.path.join(cwd, '..'))
execfile(os.path.join(cwd, "entrypoint.py")) # ghetto import
system = platform.system()
if system == 'Darwin':
icns = os.path.join(repo_base, 'build', 'icon.icns')
elif system == 'Linux':
icns = os.path.join(repo_base, 'build', 'icons', '256x256.png')
elif system == 'Windows':
icns = os.path.join(repo_base, 'build', 'icons', 'lbry256.ico')
else:
print 'Warning: System {} has no icons'.format(system)
icns = None
datas = [
(os.path.join(os.path.dirname(lbryum.__file__), 'wordlist', language + '.txt'), 'lbryum/wordlist')
for language in ('chinese_simplified', 'japanese', 'spanish','english', 'portuguese')
]
a = Entrypoint('lbrynet', 'console_scripts', 'lbrynet-console', pathex=[cwd], datas=datas)
pyz = PYZ(a.pure, a.zipped_data)
exe = EXE(
pyz,
a.scripts,
a.binaries,
a.zipfiles,
a.datas,
name='lbrynet-console',
debug=False,
strip=False,
upx=True,
console=True,
icon=icns
)


@@ -1,50 +0,0 @@
# -*- mode: python -*-
import platform
import os
import lbryum
dir = 'build';
cwd = os.getcwd()
if os.path.basename(cwd) != dir:
raise Exception('pyinstaller build needs to be run from the ' + dir + ' directory')
repo_base = os.path.abspath(os.path.join(cwd, '..'))
execfile(os.path.join(cwd, "entrypoint.py")) # ghetto import
system = platform.system()
if system == 'Darwin':
icns = os.path.join(repo_base, 'build', 'icon.icns')
elif system == 'Linux':
icns = os.path.join(repo_base, 'build', 'icons', '256x256.png')
elif system == 'Windows':
icns = os.path.join(repo_base, 'build', 'icons', 'lbry256.ico')
else:
print 'Warning: System {} has no icons'.format(system)
icns = None
datas = [
(os.path.join(os.path.dirname(lbryum.__file__), 'wordlist', language + '.txt'), 'lbryum/wordlist')
for language in ('chinese_simplified', 'japanese', 'spanish','english', 'portuguese')
]
a = Entrypoint('lbrynet', 'console_scripts', 'lbrynet-daemon', pathex=[cwd], datas=datas)
pyz = PYZ(a.pure, a.zipped_data)
exe = EXE(
pyz,
a.scripts,
a.binaries,
a.zipfiles,
a.datas,
name='lbrynet-daemon',
debug=False,
strip=False,
upx=True,
console=True,
icon=icns
)


@@ -1,47 +0,0 @@
# https://github.com/pyinstaller/pyinstaller/wiki/Recipe-Setuptools-Entry-Point
def Entrypoint(dist, group, name,
scripts=None, pathex=None, binaries=None, datas=None,
hiddenimports=None, hookspath=None, excludes=None, runtime_hooks=None,
cipher=None, win_no_prefer_redirects=False, win_private_assemblies=False):
import pkg_resources
# get toplevel packages of distribution from metadata
def get_toplevel(dist):
distribution = pkg_resources.get_distribution(dist)
if distribution.has_metadata('top_level.txt'):
return list(distribution.get_metadata('top_level.txt').split())
else:
return []
hiddenimports = hiddenimports or []
packages = []
for distribution in hiddenimports:
packages += get_toplevel(distribution)
scripts = scripts or []
pathex = pathex or []
# get the entry point
ep = pkg_resources.get_entry_info(dist, group, name)
# insert path of the egg at the verify front of the search path
pathex = [ep.dist.location] + pathex
# script name must not be a valid module name to avoid name clashes on import
script_path = os.path.join(workpath, name + '-script.py')
print "creating script for entry point", dist, group, name
with open(script_path, 'w') as fh:
fh.write("import {0}\n".format(ep.module_name))
fh.write("{0}.{1}()\n".format(ep.module_name, '.'.join(ep.attrs)))
for package in packages:
fh.write("import {0}\n".format(package))
return Analysis([script_path] + scripts,
pathex=pathex,
binaries=binaries,
datas=datas,
hiddenimports=hiddenimports,
hookspath=hookspath,
excludes=excludes,
runtime_hooks=runtime_hooks,
cipher=cipher,
win_no_prefer_redirects=win_no_prefer_redirects,
win_private_assemblies=win_private_assemblies
)
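The deleted Entrypoint recipe's core trick is generating a tiny wrapper script that imports the entry point's module and calls its attribute, then handing that script to PyInstaller's `Analysis`. A minimal sketch of just the script-generation step (function name and arguments are hypothetical, not part of the recipe):

```python
def make_wrapper(module_name, attrs, extra_packages=()):
    """Build the source of a PyInstaller wrapper script: import the entry
    point's module and invoke its callable (e.g. attrs=['main'])."""
    lines = [
        "import {0}".format(module_name),
        "{0}.{1}()".format(module_name, ".".join(attrs)),
    ]
    # extra top-level packages are imported so PyInstaller's analysis sees them
    lines += ["import {0}".format(pkg) for pkg in extra_packages]
    return "\n".join(lines) + "\n"

print(make_wrapper("lbrynet.cli", ["main"], ["lbrynet"]))
```

The generated text matches what the recipe writes with `fh.write(...)`; the wrapper deliberately has a name that is not a valid module name so it cannot shadow a real import.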

Binary file not shown.

Binary file not shown.


@@ -1,82 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
LINUX=false
OSX=false
if [ "$(uname)" == "Darwin" ]; then
OSX=true
elif [ "$(expr substr $(uname -s) 1 5)" == "Linux" ]; then
LINUX=true
else
echo "Platform detection failed"
exit 1
fi
SUDO=''
if $LINUX && (( $EUID != 0 )); then
SUDO='sudo'
fi
cmd_exists() {
command -v "$1" >/dev/null 2>&1
return $?
}
set +eu
GITUSERNAME=$(git config --global --get user.name)
if [ -z "$GITUSERNAME" ]; then
git config --global user.name "$(whoami)"
fi
GITEMAIL=$(git config --global --get user.email)
if [ -z "$GITEMAIL" ]; then
git config --global user.email "$(whoami)@lbry.io"
fi
set -eu
if $LINUX; then
INSTALL="$SUDO apt-get install --no-install-recommends -y"
$INSTALL build-essential libssl-dev libffi-dev python2.7-dev wget
elif $OSX && ! cmd_exists brew ; then
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
fi
if ! cmd_exists python; then
if $LINUX; then
$INSTALL python2.7
elif $OSX; then
brew install python
curl https://bootstrap.pypa.io/get-pip.py | python
fi
fi
PYTHON_VERSION=$(python -c 'import sys; print(".".join(map(str, sys.version_info[:2])))')
if [ "$PYTHON_VERSION" != "2.7" ]; then
echo "Python 2.7 required"
exit 1
fi
if ! cmd_exists pip; then
if $LINUX; then
$INSTALL python-pip
$SUDO pip install --upgrade pip
else
echo "Pip required"
exit 1
fi
fi
if $LINUX && [ "$(pip list --format=columns | grep setuptools | wc -l)" -ge 1 ]; then
#$INSTALL python-setuptools
$SUDO pip install setuptools
fi
if ! cmd_exists virtualenv; then
$SUDO pip install virtualenv
fi


@@ -1,11 +0,0 @@
# install daemon requirements (created by build script. see build.sh, build.ps1)
-r requirements_base.txt
# install daemon itself. make sure you run `pip install` from this dir. this is how you do relative file paths with pip
file:../.
# install other build requirements
PyInstaller==3.2.1
requests[security]==2.13.0
uritemplate==3.0.0
boto3==1.4.4


@@ -1,29 +0,0 @@
"""Set the build version to be 'dev', 'qa', 'rc', 'release'"""
import os.path
import re
import subprocess
import sys
def main():
build = get_build()
root_dir = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
with open(os.path.join(root_dir, 'lbrynet', 'build_type.py'), 'w') as f:
f.write("BUILD = '{}'\n".format(build))
def get_build():
try:
tag = subprocess.check_output(['git', 'describe', '--exact-match']).strip()
if re.match('v\d+\.\d+\.\d+rc\d+', tag):
return 'rc'
else:
return 'release'
except subprocess.CalledProcessError:
# if the build doesn't have a tag
return 'qa'
if __name__ == '__main__':
sys.exit(main())
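The build-type logic in the deleted `set_build.py` keys off `git describe --exact-match`: a tag matching `vX.Y.ZrcN` means a release candidate, any other exact tag a release, and no tag at all a QA build. The classification can be exercised in isolation (using `None` as a stand-in for the `CalledProcessError` case):

```python
import re

def classify(tag):
    """Mirror set_build.py's get_build(): 'rc' for vX.Y.ZrcN tags,
    any other exact tag is 'release'; no tag (None here) means 'qa'."""
    if tag is None:  # stands in for subprocess.CalledProcessError
        return 'qa'
    if re.match(r'v\d+\.\d+\.\d+rc\d+', tag):
        return 'rc'
    return 'release'

print(classify('v0.30.0rc1'))  # rc
print(classify('v0.30.0'))     # release
print(classify(None))          # qa
```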


@@ -1,143 +0,0 @@
import glob
import json
import os
import subprocess
import sys
import github
import uritemplate
import boto3
def main():
upload_to_github_if_tagged('lbryio/lbry')
upload_to_s3('daemon')
def get_asset_filename():
this_dir = os.path.dirname(os.path.realpath(__file__))
return glob.glob(this_dir + '/dist/*.zip')[0]
def upload_to_s3(folder):
tag = subprocess.check_output(['git', 'describe', '--always', '--abbrev=8', 'HEAD']).strip()
commit_date = subprocess.check_output([
'git', 'show', '-s', '--format=%cd', '--date=format:%Y%m%d-%H%I%S', 'HEAD']).strip()
asset_path = get_asset_filename()
bucket = 'releases.lbry.io'
key = folder + '/' + commit_date + '-' + tag + '/' + os.path.basename(asset_path)
print "Uploading " + asset_path + " to s3://" + bucket + '/' + key + ''
if 'AWS_ACCESS_KEY_ID' not in os.environ or 'AWS_SECRET_ACCESS_KEY' not in os.environ:
print 'Must set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY to publish assets to s3'
return 1
s3 = boto3.resource(
's3',
aws_access_key_id=os.environ['AWS_ACCESS_KEY_ID'],
aws_secret_access_key=os.environ['AWS_SECRET_ACCESS_KEY'],
config=boto3.session.Config(signature_version='s3v4')
)
s3.meta.client.upload_file(asset_path, bucket, key)
def upload_to_github_if_tagged(repo_name):
try:
current_tag = subprocess.check_output(
['git', 'describe', '--exact-match', 'HEAD']).strip()
except subprocess.CalledProcessError:
print 'Not uploading to GitHub as we are not currently on a tag'
return 1
print "Current tag: " + current_tag
if 'GH_TOKEN' not in os.environ:
print 'Must set GH_TOKEN in order to publish assets to a release'
return 1
gh_token = os.environ['GH_TOKEN']
auth = github.Github(gh_token)
repo = auth.get_repo(repo_name)
if not check_repo_has_tag(repo, current_tag):
print 'Tag {} is not in repo {}'.format(current_tag, repo)
# TODO: maybe this should be an error
return 1
asset_path = get_asset_filename()
print "Uploading " + asset_path + " to Github tag " + current_tag
release = get_github_release(repo, current_tag)
upload_asset_to_github(release, asset_path, gh_token)
def check_repo_has_tag(repo, target_tag):
tags = repo.get_tags().get_page(0)
for tag in tags:
if tag.name == target_tag:
return True
return False
def get_github_release(repo, current_tag):
for release in repo.get_releases():
if release.tag_name == current_tag:
return release
raise Exception('No release for {} was found'.format(current_tag))
def upload_asset_to_github(release, asset_to_upload, token):
basename = os.path.basename(asset_to_upload)
for asset in release.raw_data['assets']:
if asset['name'] == basename:
print 'File {} has already been uploaded to {}'.format(basename, release.tag_name)
return
upload_uri = uritemplate.expand(release.upload_url, {'name': basename})
count = 0
while count < 10:
try:
output = _curl_uploader(upload_uri, asset_to_upload, token)
if 'errors' in output:
raise Exception(output)
else:
print 'Successfully uploaded to {}'.format(output['browser_download_url'])
except Exception:
print 'Failed uploading on attempt {}'.format(count + 1)
count += 1
def _curl_uploader(upload_uri, asset_to_upload, token):
# using requests.post fails miserably with SSL EPIPE errors. I spent
# half a day trying to debug before deciding to switch to curl.
#
# TODO: actually set the content type
print 'Using curl to upload {} to {}'.format(asset_to_upload, upload_uri)
cmd = [
'curl',
'-sS',
'-X', 'POST',
'-u', ':{}'.format(os.environ['GH_TOKEN']),
'--header', 'Content-Type: application/octet-stream',
'--data-binary', '@-',
upload_uri
]
# '-d', '{"some_key": "some_value"}',
print 'Calling curl:'
print cmd
print
with open(asset_to_upload, 'rb') as fp:
p = subprocess.Popen(cmd, stdin=fp, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
stdout, stderr = p.communicate()
print 'curl return code:', p.returncode
if stderr:
print 'stderr output from curl:'
print stderr
print 'stdout from curl:'
print stdout
return json.loads(stdout)
if __name__ == '__main__':
sys.exit(main())
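Note that the retry loop in the deleted `upload_asset_to_github` only increments `count` on failure and never returns after a successful upload, so it loops again even when the upload worked. A corrected bounded-retry sketch (the `fake_upload` stand-in for `_curl_uploader` is hypothetical):

```python
def upload_with_retry(upload_once, attempts=10):
    """Bounded retry: unlike the original loop, return as soon as the
    upload succeeds instead of iterating again after success."""
    last_error = None
    for attempt in range(1, attempts + 1):
        try:
            return upload_once()
        except Exception as e:  # curl failure, API error response, etc.
            last_error = e
            print('Failed uploading on attempt {}'.format(attempt))
    raise RuntimeError('giving up after {} attempts: {}'.format(attempts, last_error))

# hypothetical stand-in for _curl_uploader: fails twice, then succeeds
attempts_made = []
def fake_upload():
    attempts_made.append(1)
    if len(attempts_made) < 3:
        raise IOError('transient network error')
    return {'browser_download_url': 'https://example.com/asset.zip'}

result = upload_with_retry(fake_upload)
print(result['browser_download_url'])
```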


@@ -1,29 +0,0 @@
import os
import platform
import subprocess
import sys
import zipfile
def main():
this_dir = os.path.dirname(os.path.realpath(__file__))
tag = subprocess.check_output(['git', 'describe']).strip()
zipfilename = 'lbrynet-daemon-{}-{}.zip'.format(tag, get_system_label())
full_filename = os.path.join(this_dir, 'dist', zipfilename)
executables = ['lbrynet-daemon', 'lbrynet-cli', 'lbrynet-console']
ext = '.exe' if platform.system() == 'Windows' else ''
with zipfile.ZipFile(full_filename, 'w') as myzip:
for executable in executables:
myzip.write(os.path.join(this_dir, 'dist', executable + ext), executable + ext)
def get_system_label():
system = platform.system()
if system == 'Darwin':
return 'macos'
else:
return system.lower()
if __name__ == '__main__':
sys.exit(main())


@@ -1,6 +1,6 @@
 import logging
-__version__ = "0.21.2"
+__version__ = "0.30.0a"
 version = tuple(__version__.split('.'))
 logging.getLogger(__name__).addHandler(logging.NullHandler())


@@ -24,7 +24,7 @@ BLOB_BYTES_UPLOADED = 'Blob Bytes Uploaded'
 log = logging.getLogger(__name__)

-class Manager(object):
+class Manager:
     def __init__(self, analytics_api, context=None, installation_id=None, session_id=None):
         self.analytics_api = analytics_api
         self._tracked_data = collections.defaultdict(list)
@@ -158,7 +158,7 @@ class Manager(object):
     @staticmethod
     def _download_properties(id_, name, claim_dict=None, report=None):
-        sd_hash = None if not claim_dict else claim_dict.source_hash
+        sd_hash = None if not claim_dict else claim_dict.source_hash.decode()
         p = {
             'download_id': id_,
             'name': name,
@@ -177,9 +177,9 @@ class Manager(object):
         return {
             'download_id': id_,
             'name': name,
-            'stream_info': claim_dict.source_hash,
+            'stream_info': claim_dict.source_hash.decode(),
             'error': error_name(error),
-            'reason': error.message,
+            'reason': str(error),
             'report': report
         }
@@ -193,7 +193,7 @@ class Manager(object):
             'build': platform['build'],
             'wallet': {
                 'name': wallet,
-                'version': platform['lbryum_version'] if wallet == conf.LBRYUM_WALLET else None
+                'version': platform['lbrynet_version']
             },
         },
         # TODO: expand os info to give linux/osx specific info
@@ -219,7 +219,7 @@ class Manager(object):
         callback(maybe_deferred, *args, **kwargs)

-class Api(object):
+class Api:
     def __init__(self, cookies, url, write_key, enabled):
         self.cookies = cookies
         self.url = url


@@ -1 +1 @@
-import paths
+from . import paths


@@ -1,4 +1,4 @@
-from blob_file import BlobFile
-from creator import BlobFileCreator
-from writer import HashBlobWriter
-from reader import HashBlobReader
+from .blob_file import BlobFile
+from .creator import BlobFileCreator
+from .writer import HashBlobWriter
+from .reader import HashBlobReader


@@ -13,7 +13,7 @@ log = logging.getLogger(__name__)
 MAX_BLOB_SIZE = 2 * 2 ** 20

-class BlobFile(object):
+class BlobFile:
     """
     A chunk of data available on the network which is specified by a hashsum
@@ -60,12 +60,12 @@ class BlobFile(object):
         finished_deferred - deferred that is fired when write is finished and returns
             a instance of itself as HashBlob
         """
-        if not peer in self.writers:
+        if peer not in self.writers:
             log.debug("Opening %s to be written by %s", str(self), str(peer))
             finished_deferred = defer.Deferred()
             writer = HashBlobWriter(self.get_length, self.writer_finished)
             self.writers[peer] = (writer, finished_deferred)
-            return (writer, finished_deferred)
+            return writer, finished_deferred
         log.warning("Tried to download the same file twice simultaneously from the same peer")
         return None, None
@@ -149,7 +149,7 @@ class BlobFile(object):
     def writer_finished(self, writer, err=None):
         def fire_finished_deferred():
             self._verified = True
-            for p, (w, finished_deferred) in self.writers.items():
+            for p, (w, finished_deferred) in list(self.writers.items()):
                 if w == writer:
                     del self.writers[p]
                     finished_deferred.callback(self)
@@ -160,7 +160,7 @@ class BlobFile(object):
             return False

         def errback_finished_deferred(err):
-            for p, (w, finished_deferred) in self.writers.items():
+            for p, (w, finished_deferred) in list(self.writers.items()):
                 if w == writer:
                     del self.writers[p]
                     finished_deferred.errback(err)
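The `list(self.writers.items())` wrapping in both loops matters on Python 3: deleting keys from a dict while iterating over a live view raises `RuntimeError`, so the loop must iterate over a snapshot. A minimal illustration:

```python
# iterating a live dict view while deleting from it fails on Python 3
writers = {'peer1': 'w1', 'peer2': 'w2'}
caught = None
try:
    for peer, _ in writers.items():
        del writers[peer]
except RuntimeError as e:
    caught = e
    print('live view:', e)  # dictionary changed size during iteration

# wrapping in list() snapshots the items, so mutation is safe
writers = {'peer1': 'w1', 'peer2': 'w2'}
for peer, _ in list(writers.items()):
    del writers[peer]
print(writers)  # {}
```

On Python 2, `dict.items()` already returned a list, which is why the original code worked unchanged there.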


@@ -8,7 +8,7 @@ from lbrynet.core.cryptoutils import get_lbry_hash_obj
 log = logging.getLogger(__name__)

-class BlobFileCreator(object):
+class BlobFileCreator:
     """
     This class is used to create blobs on the local filesystem
     when we do not know the blob hash beforehand (i.e, when creating


@@ -3,7 +3,7 @@ import logging
 log = logging.getLogger(__name__)

-class HashBlobReader(object):
+class HashBlobReader:
     """
     This is a file like reader class that supports
     read(size) and close()
@@ -15,7 +15,7 @@ class HashBlobReader(object):
     def __del__(self):
         if self.finished_cb_d is None:
-            log.warn("Garbage collection was called, but reader for %s was not closed yet",
+            log.warning("Garbage collection was called, but reader for %s was not closed yet",
                       self.read_handle.name)
             self.close()
@@ -28,5 +28,3 @@ class HashBlobReader(object):
             return
         self.read_handle.close()
         self.finished_cb_d = self.finished_cb(self)


@@ -7,7 +7,7 @@ from lbrynet.core.cryptoutils import get_lbry_hash_obj
 log = logging.getLogger(__name__)

-class HashBlobWriter(object):
+class HashBlobWriter:
     def __init__(self, length_getter, finished_cb):
         self.write_handle = BytesIO()
         self.length_getter = length_getter
@@ -18,7 +18,7 @@ class HashBlobWriter(object):
     def __del__(self):
         if self.finished_cb_d is None:
-            log.warn("Garbage collection was called, but writer was not closed yet")
+            log.warning("Garbage collection was called, but writer was not closed yet")
             self.close()

     @property

162
lbrynet/cli.py Normal file

@@ -0,0 +1,162 @@
import sys
from twisted.internet import asyncioreactor
if 'twisted.internet.reactor' not in sys.modules:
asyncioreactor.install()
else:
from twisted.internet import reactor
if not isinstance(reactor, asyncioreactor.AsyncioSelectorReactor):
# pyinstaller hooks install the default reactor before
# any of our code runs, see kivy for similar problem:
# https://github.com/kivy/kivy/issues/4182
del sys.modules['twisted.internet.reactor']
asyncioreactor.install()
import json
import asyncio
from aiohttp.client_exceptions import ClientConnectorError
from requests.exceptions import ConnectionError
from docopt import docopt
from textwrap import dedent
from lbrynet.daemon.Daemon import Daemon
from lbrynet.daemon.DaemonControl import start as daemon_main
from lbrynet.daemon.DaemonConsole import main as daemon_console
from lbrynet.daemon.auth.client import LBRYAPIClient
from lbrynet.core.system_info import get_platform
async def execute_command(method, params, conf_path=None):
    # this checks whether the daemon is running
try:
api = await LBRYAPIClient.get_client(conf_path)
await api.status()
except (ClientConnectorError, ConnectionError):
await api.session.close()
print("Could not connect to daemon. Are you sure it's running?")
return 1
# this actually executes the method
try:
resp = await api.call(method, params)
await api.session.close()
print(json.dumps(resp["result"], indent=2))
except KeyError:
if resp["error"]["code"] == -32500:
print(json.dumps(resp["error"], indent=2))
else:
print(json.dumps(resp["error"]["message"], indent=2))
def print_help():
print(dedent("""
NAME
lbrynet - LBRY command line client.
USAGE
lbrynet [--conf <config file>] <command> [<args>]
EXAMPLES
lbrynet commands # list available commands
lbrynet status # get daemon status
lbrynet --conf ~/l1.conf status # like above but using ~/l1.conf as config file
lbrynet resolve_name what # resolve a name
lbrynet help resolve_name # get help for a command
"""))
def print_help_for_command(command):
fn = Daemon.callable_methods.get(command)
if fn:
print(dedent(fn.__doc__))
else:
print("Invalid command name")
def normalize_value(x, key=None):
if not isinstance(x, str):
return x
if key in ('uri', 'channel_name', 'name', 'file_name', 'download_directory'):
return x
if x.lower() == 'true':
return True
if x.lower() == 'false':
return False
if x.isdigit():
return int(x)
return x
def remove_brackets(key):
if key.startswith("<") and key.endswith(">"):
return str(key[1:-1])
return key
def set_kwargs(parsed_args):
kwargs = {}
for key, arg in parsed_args.items():
k = None
if arg is None:
continue
elif key.startswith("--") and remove_brackets(key[2:]) not in kwargs:
k = remove_brackets(key[2:])
elif remove_brackets(key) not in kwargs:
k = remove_brackets(key)
kwargs[k] = normalize_value(arg, k)
return kwargs

def main(argv=None):
    argv = argv or sys.argv[1:]
    if not argv:
        print_help()
        return 1

    conf_path = None
    if len(argv) and argv[0] == "--conf":
        if len(argv) < 2:
            print("No config file specified for --conf option")
            print_help()
            return 1
        conf_path = argv[1]
        argv = argv[2:]

    method, args = argv[0], argv[1:]

    if method in ['help', '--help', '-h']:
        if len(args) == 1:
            print_help_for_command(args[0])
        else:
            print_help()
        return 0

    elif method in ['version', '--version', '-v']:
        print(json.dumps(get_platform(get_ip=False), sort_keys=True, indent=2, separators=(',', ': ')))
        return 0

    elif method == 'start':
        sys.exit(daemon_main(args, conf_path))

    elif method == 'console':
        sys.exit(daemon_console())

    elif method not in Daemon.callable_methods:
        if method not in Daemon.deprecated_methods:
            print('{} is not a valid command.'.format(method))
            return 1
        new_method = Daemon.deprecated_methods[method].new_command
        print("{} is deprecated, using {}.".format(method, new_method))
        method = new_method

    fn = Daemon.callable_methods[method]
    parsed = docopt(fn.__doc__, args)
    params = set_kwargs(parsed)
    loop = asyncio.get_event_loop()
    # propagate execute_command's failure status (it returns 1 on connection errors)
    return loop.run_until_complete(execute_command(method, params, conf_path)) or 0


if __name__ == "__main__":
    sys.exit(main())
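The `normalize_value`/`set_kwargs` pair above is what turns docopt's all-string arguments into typed values before the JSON-RPC call. A minimal standalone sketch of the same conversion rules (the helper is re-declared here so it runs outside the daemon):

```python
def normalize_value(x, key=None):
    # strings stay strings for name-like keys; otherwise coerce bools and ints
    if not isinstance(x, str):
        return x
    if key in ('uri', 'channel_name', 'name', 'file_name', 'download_directory'):
        return x
    if x.lower() == 'true':
        return True
    if x.lower() == 'false':
        return False
    if x.isdigit():
        return int(x)
    return x

print(normalize_value('true'))            # True
print(normalize_value('42'))              # 42
print(normalize_value('42', key='name'))  # '42' -- name-like keys stay strings
```

Keeping name-like keys as strings matters because a claim or file name such as `42` must not be coerced into an integer.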
View file

@@ -29,6 +29,7 @@ ENV_NAMESPACE = 'LBRY_'
 LBRYCRD_WALLET = 'lbrycrd'
 LBRYUM_WALLET = 'lbryum'
 PTC_WALLET = 'ptc'
+TORBA_WALLET = 'torba'

 PROTOCOL_PREFIX = 'lbry'
 APP_NAME = 'LBRY'
@@ -62,22 +63,6 @@ settings_encoders = {
 conf_file = None

-def _win_path_to_bytes(path):
-    """
-    Encode Windows paths to string. appdirs.user_data_dir()
-    on windows will return unicode path, unlike other platforms
-    which returns string. This will cause problems
-    because we use strings for filenames and combining them with
-    os.path.join() will result in errors.
-    """
-    for encoding in ('ASCII', 'MBCS'):
-        try:
-            return path.encode(encoding)
-        except (UnicodeEncodeError, LookupError):
-            pass
-    return path

 def _get_old_directories(platform_type):
     directories = {}
     if platform_type == WINDOWS:
@@ -142,9 +127,6 @@ elif 'win' in sys.platform:
         dirs = _get_old_directories(WINDOWS)
     else:
         dirs = _get_new_directories(WINDOWS)
-    dirs['data'] = _win_path_to_bytes(dirs['data'])
-    dirs['lbryum'] = _win_path_to_bytes(dirs['lbryum'])
-    dirs['download'] = _win_path_to_bytes(dirs['download'])
 else:
     platform = LINUX
     if os.path.isdir(_get_old_directories(LINUX)['data']) or \
@@ -182,11 +164,11 @@ class Env(envparse.Env):
             self._convert_key(key): self._convert_value(value)
             for key, value in schema.items()
         }
-        envparse.Env.__init__(self, **my_schema)
+        super().__init__(**my_schema)

     def __call__(self, key, *args, **kwargs):
         my_key = self._convert_key(key)
-        return super(Env, self).__call__(my_key, *args, **kwargs)
+        return super().__call__(my_key, *args, **kwargs)

     @staticmethod
     def _convert_key(key):
@@ -307,12 +289,12 @@ ADJUSTABLE_SETTINGS = {
 }

-class Config(object):
+class Config:
     def __init__(self, fixed_defaults, adjustable_defaults, persisted_settings=None,
                  environment=None, cli_settings=None):
         self._installation_id = None
-        self._session_id = base58.b58encode(utils.generate_id())
+        self._session_id = base58.b58encode(utils.generate_id()).decode()
         self._node_id = None
         self._fixed_defaults = fixed_defaults
@@ -338,7 +320,7 @@ class Config(object):
         self._data[TYPE_DEFAULT].update(self._fixed_defaults)
         self._data[TYPE_DEFAULT].update(
-            {k: v[1] for (k, v) in self._adjustable_defaults.iteritems()})
+            {k: v[1] for (k, v) in self._adjustable_defaults.items()})

         if persisted_settings is None:
             persisted_settings = {}
@@ -358,7 +340,7 @@ class Config(object):
         return self.get_current_settings_dict().__repr__()

     def __iter__(self):
-        for k in self._data[TYPE_DEFAULT].iterkeys():
+        for k in self._data[TYPE_DEFAULT].keys():
             yield k

     def __getitem__(self, name):
@@ -481,7 +463,7 @@ class Config(object):
             self._data[data_type][name] = value

     def update(self, updated_settings, data_types=(TYPE_RUNTIME,)):
-        for k, v in updated_settings.iteritems():
+        for k, v in updated_settings.items():
             try:
                 self.set(k, v, data_types=data_types)
             except (KeyError, AssertionError):
@@ -495,7 +477,7 @@ class Config(object):
     def get_adjustable_settings_dict(self):
         return {
-            key: val for key, val in self.get_current_settings_dict().iteritems()
+            key: val for key, val in self.get_current_settings_dict().items()
             if key in self._adjustable_defaults
         }
@@ -516,7 +498,7 @@ class Config(object):
     @staticmethod
     def _convert_conf_file_lists_reverse(converted):
         rev = {}
-        for k in converted.iterkeys():
+        for k in converted.keys():
             if k in ADJUSTABLE_SETTINGS and len(ADJUSTABLE_SETTINGS[k]) == 4:
                 rev[k] = ADJUSTABLE_SETTINGS[k][3](converted[k])
             else:
@@ -526,7 +508,7 @@ class Config(object):
     @staticmethod
     def _convert_conf_file_lists(decoded):
         converted = {}
-        for k, v in decoded.iteritems():
+        for k, v in decoded.items():
             if k in ADJUSTABLE_SETTINGS and len(ADJUSTABLE_SETTINGS[k]) >= 3:
                 converted[k] = ADJUSTABLE_SETTINGS[k][2](v)
             else:
@@ -570,7 +552,7 @@ class Config(object):
         if 'share_debug_info' in settings_dict:
             settings_dict['share_usage_data'] = settings_dict['share_debug_info']
             del settings_dict['share_debug_info']
-        for key in settings_dict.keys():
+        for key in list(settings_dict.keys()):
             if not self._is_valid_setting(key):
                 log.warning('Ignoring invalid conf file setting: %s', key)
                 del settings_dict[key]
@@ -618,7 +600,7 @@ class Config(object):
         with open(install_id_filename, "r") as install_id_file:
             self._installation_id = str(install_id_file.read()).strip()
         if not self._installation_id:
-            self._installation_id = base58.b58encode(utils.generate_id())
+            self._installation_id = base58.b58encode(utils.generate_id()).decode()
             with open(install_id_filename, "w") as install_id_file:
                 install_id_file.write(self._installation_id)
         return self._installation_id
@@ -632,20 +614,19 @@ class Config(object):
         if not self._node_id:
             self._node_id = utils.generate_id()
             with open(node_id_filename, "w") as node_id_file:
-                node_id_file.write(base58.b58encode(self._node_id))
+                node_id_file.write(base58.b58encode(self._node_id).decode())
         return self._node_id

     def get_session_id(self):
         return self._session_id

-# type: Config
-settings = None
+settings = None  # type: Config

 def get_default_env():
     env_defaults = {}
-    for k, v in ADJUSTABLE_SETTINGS.iteritems():
+    for k, v in ADJUSTABLE_SETTINGS.items():
         if len(v) == 3:
             env_defaults[k] = (v[0], None, v[2])
         elif len(v) == 4:
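Most of the churn in the config file above is the mechanical Python 3 port: `iteritems()`/`iterkeys()` are gone, `dict.items()` returns a live view, and deleting keys while iterating requires snapshotting the keys first, which is why the diff introduces `list(settings_dict.keys())`. A small stdlib-only sketch of both idioms:

```python
settings = {'share_debug_info': True, 'data_rate': 0.0001}

# Python 3: items() is a view; fine for read-only iteration
adjusted = {k: v for k, v in settings.items()}

# Deleting from a dict while iterating its live view raises RuntimeError,
# so snapshot the keys first -- mirroring `for key in list(settings_dict.keys())`
for key in list(settings.keys()):
    if key == 'share_debug_info':
        del settings[key]

print(sorted(settings))  # ['data_rate']
```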
View file

@@ -9,7 +9,7 @@ from decimal import Decimal
 log = logging.getLogger(__name__)

-class BlobAvailabilityTracker(object):
+class BlobAvailabilityTracker:
     """
     Class to track peer counts for known blobs, and to discover new popular blobs
View file

@@ -1,4 +1,4 @@
-class BlobInfo(object):
+class BlobInfo:
     """
     This structure is used to represent the metadata of a blob.

@@ -16,4 +16,3 @@ class BlobInfo(object):
         self.blob_hash = blob_hash
         self.blob_num = blob_num
         self.length = length
-
View file

@@ -1,5 +1,6 @@
 import logging
 import os
+from binascii import unhexlify
 from sqlite3 import IntegrityError
 from twisted.internet import threads, defer
 from lbrynet.blob.blob_file import BlobFile
@@ -8,7 +9,7 @@ from lbrynet.blob.creator import BlobFileCreator
 log = logging.getLogger(__name__)

-class DiskBlobManager(object):
+class DiskBlobManager:
     def __init__(self, blob_dir, storage, node_datastore=None):
         """
         This class stores blobs on the hard disk
@@ -60,7 +61,7 @@ class DiskBlobManager(object):
             blob.blob_hash, blob.length, next_announce_time, should_announce
         )
         if self._node_datastore is not None:
-            self._node_datastore.completed_blobs.add(blob.blob_hash.decode('hex'))
+            self._node_datastore.completed_blobs.add(unhexlify(blob.blob_hash))

     def completed_blobs(self, blobhashes_to_check):
         return self._completed_blobs(blobhashes_to_check)
@@ -100,7 +101,7 @@ class DiskBlobManager(object):
                 continue
             if self._node_datastore is not None:
                 try:
-                    self._node_datastore.completed_blobs.remove(blob_hash.decode('hex'))
+                    self._node_datastore.completed_blobs.remove(unhexlify(blob_hash))
                 except KeyError:
                     pass
             try:
@@ -113,7 +114,7 @@ class DiskBlobManager(object):
             try:
                 yield self.storage.delete_blobs_from_db(bh_to_delete_from_db)
             except IntegrityError as err:
-                if err.message != "FOREIGN KEY constraint failed":
+                if str(err) != "FOREIGN KEY constraint failed":
                     raise err

     @defer.inlineCallbacks
View file

@@ -1,4 +1,4 @@
-class DownloadOptionChoice(object):
+class DownloadOptionChoice:
     """A possible choice that can be picked for some option.

     An option can have one or more choices that can be picked from.
@@ -10,7 +10,7 @@ class DownloadOptionChoice(object):
         self.bool_options_description = bool_options_description

-class DownloadOption(object):
+class DownloadOption:
     """An option for a user to select a value from several different choices."""

     def __init__(self, option_types, long_description, short_description, default_value,
                  default_value_description):
View file

@@ -1,3 +1,7 @@
+class RPCError(Exception):
+    code = 0
+
+
 class PriceDisagreementError(Exception):
     pass
@@ -12,19 +16,19 @@ class DownloadCanceledError(Exception):

 class DownloadSDTimeout(Exception):
     def __init__(self, download):
-        Exception.__init__(self, 'Failed to download sd blob {} within timeout'.format(download))
+        super().__init__('Failed to download sd blob {} within timeout'.format(download))
         self.download = download

 class DownloadTimeoutError(Exception):
     def __init__(self, download):
-        Exception.__init__(self, 'Failed to download {} within timeout'.format(download))
+        super().__init__('Failed to download {} within timeout'.format(download))
         self.download = download

 class DownloadDataTimeout(Exception):
     def __init__(self, download):
-        Exception.__init__(self, 'Failed to download data blobs for sd hash '
-                                 '{} within timeout'.format(download))
+        super().__init__('Failed to download data blobs for sd hash '
+                         '{} within timeout'.format(download))
         self.download = download
@@ -41,8 +45,8 @@ class NullFundsError(Exception):
     pass

-class InsufficientFundsError(Exception):
-    pass
+class InsufficientFundsError(RPCError):
+    code = -310

 class ConnectionClosedBeforeResponseError(Exception):
@@ -55,39 +59,41 @@ class KeyFeeAboveMaxAllowed(Exception):

 class InvalidExchangeRateResponse(Exception):
     def __init__(self, source, reason):
-        Exception.__init__(self, 'Failed to get exchange rate from {}:{}'.format(source, reason))
+        super().__init__('Failed to get exchange rate from {}:{}'.format(source, reason))
         self.source = source
         self.reason = reason

 class UnknownNameError(Exception):
     def __init__(self, name):
-        Exception.__init__(self, 'Name {} is unknown'.format(name))
+        super().__init__('Name {} is unknown'.format(name))
         self.name = name

 class UnknownClaimID(Exception):
     def __init__(self, claim_id):
-        Exception.__init__(self, 'Claim {} is unknown'.format(claim_id))
+        super().__init__('Claim {} is unknown'.format(claim_id))
         self.claim_id = claim_id

 class UnknownURI(Exception):
     def __init__(self, uri):
-        Exception.__init__(self, 'URI {} cannot be resolved'.format(uri))
+        super().__init__('URI {} cannot be resolved'.format(uri))
         self.name = uri

 class UnknownOutpoint(Exception):
     def __init__(self, outpoint):
-        Exception.__init__(self, 'Outpoint {} cannot be resolved'.format(outpoint))
+        super().__init__('Outpoint {} cannot be resolved'.format(outpoint))
         self.outpoint = outpoint

 class InvalidName(Exception):
     def __init__(self, name, invalid_characters):
         self.name = name
         self.invalid_characters = invalid_characters
-        Exception.__init__(
-            self, 'URI contains invalid characters: {}'.format(','.join(invalid_characters)))
+        super().__init__(
+            'URI contains invalid characters: {}'.format(','.join(invalid_characters)))

 class UnknownStreamTypeError(Exception):
@@ -105,7 +111,7 @@ class InvalidStreamDescriptorError(Exception):
 class InvalidStreamInfoError(Exception):
     def __init__(self, name, stream_info):
         msg = '{} has claim with invalid stream info: {}'.format(name, stream_info)
-        Exception.__init__(self, msg)
+        super().__init__(msg)
         self.name = name
         self.stream_info = stream_info
@@ -159,14 +165,14 @@ class NegotiationError(Exception):
 class InvalidCurrencyError(Exception):
     def __init__(self, currency):
         self.currency = currency
-        Exception.__init__(
-            self, 'Invalid currency: {} is not a supported currency.'.format(currency))
+        super().__init__(
+            'Invalid currency: {} is not a supported currency.'.format(currency))

 class NoSuchDirectoryError(Exception):
     def __init__(self, directory):
         self.directory = directory
-        Exception.__init__(self, 'No such directory {}'.format(directory))
+        super().__init__('No such directory {}'.format(directory))

 class ComponentStartConditionNotMet(Exception):
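The new `RPCError` base introduced in this file gives JSON-RPC errors a machine-readable `code` alongside the normal exception message; `InsufficientFundsError` now carries `-310` (the value shown in the diff). A minimal re-declaration of the pattern for illustration:

```python
class RPCError(Exception):
    # subclasses override `code` with their JSON-RPC error code
    code = 0


class InsufficientFundsError(RPCError):
    code = -310


try:
    raise InsufficientFundsError('not enough credits')
except RPCError as err:
    # the handler can branch on err.code instead of parsing the message
    print(err.code, err)  # -310 not enough credits
```

Catching the common base lets the RPC layer map any such exception to a structured error response without a per-class lookup table.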
View file

@@ -9,7 +9,7 @@ from lbrynet.core.Error import DownloadCanceledError
 log = logging.getLogger(__name__)

-class HTTPBlobDownloader(object):
+class HTTPBlobDownloader:
     '''
     A downloader that is able to get blobs from HTTP mirrors.
     Note that when a blob gets downloaded from a mirror or from a peer, BlobManager will mark it as completed
View file

@@ -1,7 +1,7 @@
 from decimal import Decimal

-class Offer(object):
+class Offer:
     """A rate offer to download blobs from a host."""

     RATE_ACCEPTED = "RATE_ACCEPTED"
View file

@@ -3,14 +3,14 @@ from lbrynet import conf
 from decimal import Decimal

-class BasePaymentRateManager(object):
+class BasePaymentRateManager:
     def __init__(self, rate=None, info_rate=None):
         self.min_blob_data_payment_rate = rate if rate is not None else conf.settings['data_rate']
         self.min_blob_info_payment_rate = (
             info_rate if info_rate is not None else conf.settings['min_info_rate'])

-class PaymentRateManager(object):
+class PaymentRateManager:
     def __init__(self, base, rate=None):
         """
         @param base: a BasePaymentRateManager
@@ -36,7 +36,7 @@ class PaymentRateManager(object):
         self.points_paid += amount

-class NegotiatedPaymentRateManager(object):
+class NegotiatedPaymentRateManager:
     def __init__(self, base, availability_tracker, generous=None):
         """
         @param base: a BasePaymentRateManager
@@ -84,7 +84,7 @@ class NegotiatedPaymentRateManager(object):
         return False

-class OnlyFreePaymentsManager(object):
+class OnlyFreePaymentsManager:
     def __init__(self, **kwargs):
         """
         A payment rate manager that will only ever accept and offer a rate of 0.0,
View file

@@ -3,7 +3,7 @@ from collections import defaultdict
 from lbrynet.core import utils

 # Do not create this object except through PeerManager
-class Peer(object):
+class Peer:
     def __init__(self, host, port):
         self.host = host
         self.port = port
View file

@@ -1,7 +1,7 @@
 from lbrynet.core.Peer import Peer

-class PeerManager(object):
+class PeerManager:
     def __init__(self):
         self.peers = []
View file

@@ -9,7 +9,7 @@ def get_default_price_model(blob_tracker, base_price, **kwargs):
     return MeanAvailabilityWeightedPrice(blob_tracker, base_price, **kwargs)

-class ZeroPrice(object):
+class ZeroPrice:
     def __init__(self):
         self.base_price = 0.0
@@ -17,7 +17,7 @@ class ZeroPrice(object):
         return 0.0

-class MeanAvailabilityWeightedPrice(object):
+class MeanAvailabilityWeightedPrice:
     """Calculate mean-blob-availability and stream-position weighted price for a blob

     Attributes:
View file

@@ -1,14 +1,12 @@
 import logging

-from zope.interface import implements
-from lbrynet.interfaces import IRateLimiter
 from twisted.internet import task

 log = logging.getLogger(__name__)

-class DummyRateLimiter(object):
+class DummyRateLimiter:
     def __init__(self):
         self.dl_bytes_this_second = 0
         self.ul_bytes_this_second = 0
@@ -46,10 +44,10 @@ class DummyRateLimiter(object):
         self.total_ul_bytes += num_bytes

-class RateLimiter(object):
+class RateLimiter:
     """This class ensures that upload and download rates don't exceed specified maximums"""
-    implements(IRateLimiter)
+    #implements(IRateLimiter)

     #called by main application
View file

@@ -19,7 +19,7 @@ log = logging.getLogger(__name__)

 class SinglePeerFinder(DummyPeerFinder):
     def __init__(self, peer):
-        DummyPeerFinder.__init__(self)
+        super().__init__()
         self.peer = peer

     def find_peers_for_blob(self, blob_hash, timeout=None, filter_self=False):
@@ -28,7 +28,7 @@ class SinglePeerFinder(DummyPeerFinder):

 class BlobCallback(BlobFile):
     def __init__(self, blob_dir, blob_hash, timeout):
-        BlobFile.__init__(self, blob_dir, blob_hash)
+        super().__init__(blob_dir, blob_hash)
         self.callback = defer.Deferred()
         reactor.callLater(timeout, self._cancel)
@@ -43,7 +43,7 @@ class BlobCallback(BlobFile):
         return result

-class SingleBlobDownloadManager(object):
+class SingleBlobDownloadManager:
     def __init__(self, blob):
         self.blob = blob
@@ -57,7 +57,7 @@ class SingleBlobDownloadManager(object):
         return self.blob.blob_hash

-class SinglePeerDownloader(object):
+class SinglePeerDownloader:
     def __init__(self):
         self._payment_rate_manager = OnlyFreePaymentsManager()
         self._rate_limiter = DummyRateLimiter()
View file

@@ -10,7 +10,7 @@ def get_default_strategy(blob_tracker, **kwargs):
     return BasicAvailabilityWeightedStrategy(blob_tracker, **kwargs)

-class Strategy(object):
+class Strategy:
     """
     Base for negotiation strategies
     """
@@ -109,7 +109,7 @@ class BasicAvailabilityWeightedStrategy(Strategy):
                  base_price=0.0001, alpha=1.0):
         price_model = MeanAvailabilityWeightedPrice(
             blob_tracker, base_price=base_price, alpha=alpha)
-        Strategy.__init__(self, price_model, max_rate, min_rate, is_generous)
+        super().__init__(price_model, max_rate, min_rate, is_generous)
         self._acceleration = Decimal(acceleration)  # rate of how quickly to ramp offer
         self._deceleration = Decimal(deceleration)
@@ -140,7 +140,7 @@ class OnlyFreeStrategy(Strategy):
     implementer(INegotiationStrategy)

     def __init__(self, *args, **kwargs):
         price_model = ZeroPrice()
-        Strategy.__init__(self, price_model, 0.0, 0.0, True)
+        super().__init__(price_model, 0.0, 0.0, True)

     def _get_mean_rate(self, rates):
         return 0.0
View file

@ -1,4 +1,5 @@
import binascii from binascii import unhexlify
import string
from collections import defaultdict from collections import defaultdict
import json import json
import logging import logging
@ -12,7 +13,14 @@ from lbrynet.core.HTTPBlobDownloader import HTTPBlobDownloader
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
class StreamDescriptorReader(object): class JSONBytesEncoder(json.JSONEncoder):
def default(self, obj): # pylint: disable=E0202
if isinstance(obj, bytes):
return obj.decode()
return super().default(obj)
class StreamDescriptorReader:
"""Classes which derive from this class read a stream descriptor file return """Classes which derive from this class read a stream descriptor file return
a dictionary containing the fields in the file""" a dictionary containing the fields in the file"""
def __init__(self): def __init__(self):
@ -33,7 +41,7 @@ class StreamDescriptorReader(object):
class PlainStreamDescriptorReader(StreamDescriptorReader): class PlainStreamDescriptorReader(StreamDescriptorReader):
"""Read a stream descriptor file which is not a blob but a regular file""" """Read a stream descriptor file which is not a blob but a regular file"""
def __init__(self, stream_descriptor_filename): def __init__(self, stream_descriptor_filename):
StreamDescriptorReader.__init__(self) super().__init__()
self.stream_descriptor_filename = stream_descriptor_filename self.stream_descriptor_filename = stream_descriptor_filename
def _get_raw_data(self): def _get_raw_data(self):
@ -49,7 +57,7 @@ class PlainStreamDescriptorReader(StreamDescriptorReader):
class BlobStreamDescriptorReader(StreamDescriptorReader): class BlobStreamDescriptorReader(StreamDescriptorReader):
"""Read a stream descriptor file which is a blob""" """Read a stream descriptor file which is a blob"""
def __init__(self, blob): def __init__(self, blob):
StreamDescriptorReader.__init__(self) super().__init__()
self.blob = blob self.blob = blob
def _get_raw_data(self): def _get_raw_data(self):
@ -66,14 +74,16 @@ class BlobStreamDescriptorReader(StreamDescriptorReader):
return threads.deferToThread(get_data) return threads.deferToThread(get_data)
class StreamDescriptorWriter(object): class StreamDescriptorWriter:
"""Classes which derive from this class write fields from a dictionary """Classes which derive from this class write fields from a dictionary
of fields to a stream descriptor""" of fields to a stream descriptor"""
def __init__(self): def __init__(self):
pass pass
def create_descriptor(self, sd_info): def create_descriptor(self, sd_info):
return self._write_stream_descriptor(json.dumps(sd_info)) return self._write_stream_descriptor(
json.dumps(sd_info, sort_keys=True).encode()
)
def _write_stream_descriptor(self, raw_data): def _write_stream_descriptor(self, raw_data):
"""This method must be overridden by subclasses to write raw data to """This method must be overridden by subclasses to write raw data to
@ -84,7 +94,7 @@ class StreamDescriptorWriter(object):
class PlainStreamDescriptorWriter(StreamDescriptorWriter): class PlainStreamDescriptorWriter(StreamDescriptorWriter):
def __init__(self, sd_file_name): def __init__(self, sd_file_name):
StreamDescriptorWriter.__init__(self) super().__init__()
self.sd_file_name = sd_file_name self.sd_file_name = sd_file_name
def _write_stream_descriptor(self, raw_data): def _write_stream_descriptor(self, raw_data):
@ -100,7 +110,7 @@ class PlainStreamDescriptorWriter(StreamDescriptorWriter):
class BlobStreamDescriptorWriter(StreamDescriptorWriter): class BlobStreamDescriptorWriter(StreamDescriptorWriter):
def __init__(self, blob_manager): def __init__(self, blob_manager):
StreamDescriptorWriter.__init__(self) super().__init__()
self.blob_manager = blob_manager self.blob_manager = blob_manager
@defer.inlineCallbacks @defer.inlineCallbacks
@ -114,7 +124,7 @@ class BlobStreamDescriptorWriter(StreamDescriptorWriter):
defer.returnValue(sd_hash) defer.returnValue(sd_hash)
class StreamMetadata(object): class StreamMetadata:
FROM_BLOB = 1 FROM_BLOB = 1
FROM_PLAIN = 2 FROM_PLAIN = 2
@ -127,7 +137,7 @@ class StreamMetadata(object):
self.source_file = None self.source_file = None
class StreamDescriptorIdentifier(object): class StreamDescriptorIdentifier:
"""Tries to determine the type of stream described by the stream descriptor using the """Tries to determine the type of stream described by the stream descriptor using the
'stream_type' field. Keeps a list of StreamDescriptorValidators and StreamDownloaderFactorys 'stream_type' field. Keeps a list of StreamDescriptorValidators and StreamDownloaderFactorys
 and returns the appropriate ones based on the type of the stream descriptor given
@@ -254,7 +264,7 @@ def save_sd_info(blob_manager, sd_hash, sd_info):
                             (sd_hash, calculated_sd_hash))
     stream_hash = yield blob_manager.storage.get_stream_hash_for_sd_hash(sd_hash)
     if not stream_hash:
-        log.debug("Saving info for %s", sd_info['stream_name'].decode('hex'))
+        log.debug("Saving info for %s", unhexlify(sd_info['stream_name']))
         stream_name = sd_info['stream_name']
         key = sd_info['key']
         stream_hash = sd_info['stream_hash']
@@ -272,9 +282,9 @@ def format_blobs(crypt_blob_infos):
     for blob_info in crypt_blob_infos:
         blob = {}
         if blob_info.length != 0:
-            blob['blob_hash'] = str(blob_info.blob_hash)
+            blob['blob_hash'] = blob_info.blob_hash
         blob['blob_num'] = blob_info.blob_num
-        blob['iv'] = str(blob_info.iv)
+        blob['iv'] = blob_info.iv
         blob['length'] = blob_info.length
         formatted_blobs.append(blob)
     return formatted_blobs
@@ -344,18 +354,18 @@ def get_blob_hashsum(b):
     iv = b['iv']
     blob_hashsum = get_lbry_hash_obj()
     if length != 0:
-        blob_hashsum.update(blob_hash)
-    blob_hashsum.update(str(blob_num))
-    blob_hashsum.update(iv)
-    blob_hashsum.update(str(length))
+        blob_hashsum.update(blob_hash.encode())
+    blob_hashsum.update(str(blob_num).encode())
+    blob_hashsum.update(iv.encode())
+    blob_hashsum.update(str(length).encode())
     return blob_hashsum.digest()

 def get_stream_hash(hex_stream_name, key, hex_suggested_file_name, blob_infos):
     h = get_lbry_hash_obj()
-    h.update(hex_stream_name)
-    h.update(key)
-    h.update(hex_suggested_file_name)
+    h.update(hex_stream_name.encode())
+    h.update(key.encode())
+    h.update(hex_suggested_file_name.encode())
     blobs_hashsum = get_lbry_hash_obj()
     for blob in blob_infos:
         blobs_hashsum.update(get_blob_hashsum(blob))
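The `.encode()` calls added throughout the hunk above exist because Python 3 hash objects reject `str`. A minimal standalone sketch of the same pattern, assuming `get_lbry_hash_obj()` is a SHA-384 digest and using a made-up blob record:

```python
import hashlib

def get_blob_hashsum(b):
    # hashlib update() only accepts bytes-like objects on Python 3,
    # so every str field must be encoded first
    h = hashlib.sha384()  # assumption: stands in for get_lbry_hash_obj()
    if b['length'] != 0:
        h.update(b['blob_hash'].encode())
    h.update(str(b['blob_num']).encode())
    h.update(b['iv'].encode())
    h.update(str(b['length']).encode())
    return h.digest()

# hypothetical blob record, purely for illustration
blob = {'blob_hash': 'ab' * 48, 'blob_num': 0, 'iv': '00' * 16, 'length': 1024}
digest = get_blob_hashsum(blob)
```

Without the `.encode()` calls, each `update()` would raise `TypeError: Unicode-objects must be encoded before hashing`.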
@@ -364,9 +374,8 @@ def get_stream_hash(hex_stream_name, key, hex_suggested_file_name, blob_infos):
 def verify_hex(text, field_name):
-    for c in text:
-        if c not in '0123456789abcdef':
-            raise InvalidStreamDescriptorError("%s is not a hex-encoded string" % field_name)
+    if not set(text).issubset(set(string.hexdigits)):
+        raise InvalidStreamDescriptorError("%s is not a hex-encoded string" % field_name)
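The rewritten `verify_hex` replaces the per-character loop with one subset test. Note that `string.hexdigits` also contains uppercase `A`-`F`, so the new check is slightly more permissive than the old `'0123456789abcdef'` comparison. A standalone sketch, substituting `ValueError` for the project's `InvalidStreamDescriptorError`:

```python
import string

def verify_hex(text, field_name):
    # a single set-subset test replaces iterating characters one at a time
    if not set(text).issubset(set(string.hexdigits)):
        raise ValueError("%s is not a hex-encoded string" % field_name)

verify_hex("00deadBEEF", "sd_hash")  # accepted, including uppercase digits
try:
    verify_hex("not-hex!", "sd_hash")
    raised = False
except ValueError:
    raised = True
```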
 def validate_descriptor(stream_info):
@@ -397,7 +406,7 @@ def validate_descriptor(stream_info):
     return True

-class EncryptedFileStreamDescriptorValidator(object):
+class EncryptedFileStreamDescriptorValidator:
     def __init__(self, raw_info):
         self.raw_info = raw_info
@@ -406,14 +415,14 @@ class EncryptedFileStreamDescriptorValidator(object):
     def info_to_show(self):
         info = []
-        info.append(("stream_name", binascii.unhexlify(self.raw_info.get("stream_name"))))
+        info.append(("stream_name", unhexlify(self.raw_info.get("stream_name"))))
         size_so_far = 0
         for blob_info in self.raw_info.get("blobs", []):
             size_so_far += int(blob_info['length'])
         info.append(("stream_size", str(self.get_length_of_stream())))
         suggested_file_name = self.raw_info.get("suggested_file_name", None)
         if suggested_file_name is not None:
-            suggested_file_name = binascii.unhexlify(suggested_file_name)
+            suggested_file_name = unhexlify(suggested_file_name)
         info.append(("suggested_file_name", suggested_file_name))
         return info

File diff suppressed because it is too large.


@@ -8,7 +8,7 @@ DELAY_INCREMENT = 0.0001
 QUEUE_SIZE_THRESHOLD = 100

-class CallLaterManager(object):
+class CallLaterManager:
     def __init__(self, callLater):
         """
        :param callLater: (IReactorTime.callLater)


@@ -5,13 +5,11 @@ from decimal import Decimal
 from twisted.internet import defer
 from twisted.python.failure import Failure
 from twisted.internet.error import ConnectionAborted
-from zope.interface import implements
 from lbrynet.core.Error import ConnectionClosedBeforeResponseError
 from lbrynet.core.Error import InvalidResponseError, RequestCanceledError, NoResponseError
 from lbrynet.core.Error import PriceDisagreementError, DownloadCanceledError, InsufficientFundsError
 from lbrynet.core.client.ClientRequest import ClientRequest, ClientBlobRequest
-from lbrynet.interfaces import IRequestCreator
 from lbrynet.core.Offer import Offer
@@ -39,8 +37,8 @@ def cache(fn):
     return helper

-class BlobRequester(object):
-    implements(IRequestCreator)
+class BlobRequester:
+    #implements(IRequestCreator)

     def __init__(self, blob_manager, peer_finder, payment_rate_manager, wallet, download_manager):
         self.blob_manager = blob_manager
@@ -163,7 +161,7 @@ class BlobRequester(object):
         return True

     def _get_bad_peers(self):
-        return [p for p in self._peers.iterkeys() if not self._should_send_request_to(p)]
+        return [p for p in self._peers.keys() if not self._should_send_request_to(p)]

     def _hash_available(self, blob_hash):
         for peer in self._available_blobs:
@@ -195,7 +193,7 @@ class BlobRequester(object):
         self._peers[peer] += amount

-class RequestHelper(object):
+class RequestHelper:
     def __init__(self, requestor, peer, protocol, payment_rate_manager):
         self.requestor = requestor
         self.peer = peer
@@ -429,7 +427,7 @@ class PriceRequest(RequestHelper):
 class DownloadRequest(RequestHelper):
     """Choose a blob and download it from a peer and also pay the peer for the data."""
     def __init__(self, requester, peer, protocol, payment_rate_manager, wallet, head_blob_hash):
-        RequestHelper.__init__(self, requester, peer, protocol, payment_rate_manager)
+        super().__init__(requester, peer, protocol, payment_rate_manager)
         self.wallet = wallet
         self.head_blob_hash = head_blob_hash
@@ -578,7 +576,7 @@ class DownloadRequest(RequestHelper):
         return reason

-class BlobDownloadDetails(object):
+class BlobDownloadDetails:
     """Contains the information needed to make a ClientBlobRequest from an open blob"""
     def __init__(self, blob, deferred, write_func, cancel_func, peer):
         self.blob = blob
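The recurring `class Foo(object):` to `class Foo:` change is safe because every Python 3 class is new-style; the two spellings define identical classes:

```python
class Explicit(object):
    pass

class Implicit:
    pass

# both inherit from object and share the default metaclass
assert Implicit.__bases__ == Explicit.__bases__ == (object,)
assert type(Implicit) is type(Explicit) is type
```

The commented-out `implements(...)` lines are a related Python 3 casualty: `zope.interface.implements` relied on Python 2 metaclass tricks and raises at class-definition time on Python 3 (its replacement is the `@implementer` class decorator).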


@@ -10,8 +10,6 @@ from lbrynet.core import utils
 from lbrynet.core.Error import ConnectionClosedBeforeResponseError, NoResponseError
 from lbrynet.core.Error import DownloadCanceledError, MisbehavingPeerError
 from lbrynet.core.Error import RequestCanceledError
-from lbrynet.interfaces import IRequestSender, IRateLimited
-from zope.interface import implements

 log = logging.getLogger(__name__)
@@ -24,7 +22,7 @@ def encode_decimal(obj):
 class ClientProtocol(Protocol, TimeoutMixin):
-    implements(IRequestSender, IRateLimited)
+    #implements(IRequestSender, IRateLimited)

     ######### Protocol #########
     PROTOCOL_TIMEOUT = 30
@@ -34,7 +32,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
         self._rate_limiter = self.factory.rate_limiter
         self.peer = self.factory.peer
         self._response_deferreds = {}
-        self._response_buff = ''
+        self._response_buff = b''
         self._downloading_blob = False
         self._blob_download_request = None
         self._next_request = {}
@@ -61,7 +59,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
             self.transport.loseConnection()
         response, extra_data = self._get_valid_response(self._response_buff)
         if response is not None:
-            self._response_buff = ''
+            self._response_buff = b''
             self._handle_response(response)
         if self._downloading_blob is True and len(extra_data) != 0:
             self._blob_download_request.write(extra_data)
@@ -71,17 +69,17 @@ class ClientProtocol(Protocol, TimeoutMixin):
         self.peer.report_down()
         self.transport.abortConnection()

-    def connectionLost(self, reason):
+    def connectionLost(self, reason=None):
         log.debug("Connection lost to %s: %s", self.peer, reason)
         self.setTimeout(None)
         self.connection_closed = True
-        if reason.check(error.ConnectionDone):
+        if reason is None or reason.check(error.ConnectionDone):
             err = failure.Failure(ConnectionClosedBeforeResponseError())
         else:
             err = reason
         for key, d in self._response_deferreds.items():
-            del self._response_deferreds[key]
             d.errback(err)
+        self._response_deferreds.clear()
         if self._blob_download_request is not None:
             self._blob_download_request.cancel(err)
         self.factory.connection_was_made_deferred.callback(True)
@@ -111,7 +109,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
         self.connection_closing = True
         ds = []
         err = RequestCanceledError()
-        for key, d in self._response_deferreds.items():
+        for key, d in list(self._response_deferreds.items()):
             del self._response_deferreds[key]
             d.errback(err)
             ds.append(d)
@@ -126,7 +124,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
     def _handle_request_error(self, err):
         log.error("An unexpected error occurred creating or sending a request to %s. %s: %s",
-                  self.peer, err.type, err.message)
+                  self.peer, err.type, err)
         self.transport.loseConnection()

     def _ask_for_request(self):
@@ -151,7 +149,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
         self.setTimeout(self.PROTOCOL_TIMEOUT)
         # TODO: compare this message to the last one. If they're the same,
         # TODO: incrementally delay this message.
-        m = json.dumps(request_msg, default=encode_decimal)
+        m = json.dumps(request_msg, default=encode_decimal).encode()
         self.transport.write(m)

     def _get_valid_response(self, response_msg):
@@ -159,7 +157,7 @@ class ClientProtocol(Protocol, TimeoutMixin):
         response = None
         curr_pos = 0
         while 1:
-            next_close_paren = response_msg.find('}', curr_pos)
+            next_close_paren = response_msg.find(b'}', curr_pos)
             if next_close_paren != -1:
                 curr_pos = next_close_paren + 1
             try:
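The buffer becomes `b''` because Twisted's `dataReceived` delivers `bytes` on Python 3, and that forces the search pattern in `_get_valid_response` to be bytes as well (`find(b'}')`); mixing `str` and `bytes` raises `TypeError`. Likewise, outgoing JSON must be encoded before `transport.write`. A sketch of the parse step with a made-up payload:

```python
import json

buff = b''
buff += b'{"requested_blobs": []}BLOBDATA'  # a JSON response followed by raw blob bytes

# bytes.find() requires a bytes needle on Python 3
next_close_paren = buff.find(b'}', 0)
msg, extra_data = buff[:next_close_paren + 1], buff[next_close_paren + 1:]

response = json.loads(msg)  # json.loads accepts bytes directly on Python 3.6+
```

The trailing `extra_data` is what the code above hands to `self._blob_download_request.write(...)` when a blob download is in flight.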


@@ -1,7 +1,7 @@
 from lbrynet.blob.blob_file import MAX_BLOB_SIZE

-class ClientRequest(object):
+class ClientRequest:
     def __init__(self, request_dict, response_identifier=None):
         self.request_dict = request_dict
         self.response_identifier = response_identifier
@@ -9,7 +9,7 @@ class ClientRequest(object):
 class ClientPaidRequest(ClientRequest):
     def __init__(self, request_dict, response_identifier, max_pay_units):
-        ClientRequest.__init__(self, request_dict, response_identifier)
+        super().__init__(request_dict, response_identifier)
         self.max_pay_units = max_pay_units
@@ -20,7 +20,7 @@ class ClientBlobRequest(ClientPaidRequest):
             max_pay_units = MAX_BLOB_SIZE
         else:
             max_pay_units = blob.length
-        ClientPaidRequest.__init__(self, request_dict, response_identifier, max_pay_units)
+        super().__init__(request_dict, response_identifier, max_pay_units)
         self.write = write_func
         self.finished_deferred = finished_deferred
         self.cancel = cancel_func
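The explicit `Parent.__init__(self, ...)` calls become Python 3's zero-argument `super()`, which resolves the class and instance from the enclosing scope:

```python
class ClientRequest:
    def __init__(self, request_dict, response_identifier=None):
        self.request_dict = request_dict
        self.response_identifier = response_identifier

class ClientPaidRequest(ClientRequest):
    def __init__(self, request_dict, response_identifier, max_pay_units):
        # zero-argument form; equivalent here to ClientRequest.__init__(self, ...)
        # but robust if the base class is later renamed or the MRO changes
        super().__init__(request_dict, response_identifier)
        self.max_pay_units = max_pay_units

# hypothetical request, just to exercise the chain
req = ClientPaidRequest({'requested_blob': 'deadbeef'}, 'requested_blob', 10)
```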


@@ -1,8 +1,6 @@
 import random
 import logging
 from twisted.internet import defer, reactor
-from zope.interface import implements
-from lbrynet import interfaces
 from lbrynet import conf
 from lbrynet.core.client.ClientProtocol import ClientProtocolFactory
 from lbrynet.core.Error import InsufficientFundsError
@@ -11,15 +9,15 @@ from lbrynet.core import utils
 log = logging.getLogger(__name__)

-class PeerConnectionHandler(object):
+class PeerConnectionHandler:
     def __init__(self, request_creators, factory):
         self.request_creators = request_creators
         self.factory = factory
         self.connection = None

-class ConnectionManager(object):
-    implements(interfaces.IConnectionManager)
+class ConnectionManager:
+    #implements(interfaces.IConnectionManager)

     MANAGE_CALL_INTERVAL_SEC = 5
     TCP_CONNECT_TIMEOUT = 15
@@ -98,7 +96,8 @@ class ConnectionManager(object):
             d.addBoth(lambda _: disconnect_peer(p))
             return d

-        closing_deferreds = [close_connection(peer) for peer in self._peer_connections.keys()]
+        # fixme: stop modifying dict during iteration
+        closing_deferreds = [close_connection(peer) for peer in list(self._peer_connections)]
         return defer.DeferredList(closing_deferreds)

     @defer.inlineCallbacks
@@ -226,5 +225,3 @@ class ConnectionManager(object):
         del self._connections_closing[peer]
         d.callback(True)
         return connection_was_made
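The `fixme` comment and the `list(self._peer_connections)` snapshot guard against a real Python 3 hazard: `close_connection` ends up removing entries from the dict being iterated, and Python 3 raises `RuntimeError: dictionary changed size during iteration` where Python 2's `.keys()` (a copied list) silently tolerated it. Copying the keys up front restores the old behaviour:

```python
# made-up stand-in for self._peer_connections
peer_connections = {'peer-a': 1, 'peer-b': 2}

def close_connection(peer):
    # mutates the dict we are looping over, like the real close_connection does
    del peer_connections[peer]

# list() snapshots the keys, so the loop is immune to the deletions
for peer in list(peer_connections):
    close_connection(peer)
```

Iterating `peer_connections` directly here (without `list()`) would raise `RuntimeError` on Python 3.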


@@ -1,14 +1,12 @@
 import logging
 from twisted.internet import defer
-from zope.interface import implements
-from lbrynet import interfaces

 log = logging.getLogger(__name__)

-class DownloadManager(object):
-    implements(interfaces.IDownloadManager)
+class DownloadManager:
+    #implements(interfaces.IDownloadManager)

     def __init__(self, blob_manager):
         self.blob_manager = blob_manager
@@ -81,14 +79,14 @@ class DownloadManager(object):
         return self.blob_handler.handle_blob(self.blobs[blob_num], self.blob_infos[blob_num])

     def calculate_total_bytes(self):
-        return sum([bi.length for bi in self.blob_infos.itervalues()])
+        return sum([bi.length for bi in self.blob_infos.values()])

     def calculate_bytes_left_to_output(self):
         if not self.blobs:
             return self.calculate_total_bytes()
         else:
             to_be_outputted = [
-                b for n, b in self.blobs.iteritems()
+                b for n, b in self.blobs.items()
                 if n >= self.progress_manager.last_blob_outputted
             ]
             return sum([b.length for b in to_be_outputted if b.length is not None])
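The blanket `iterkeys`/`itervalues`/`iteritems` renames work because those Python 2-only methods are gone, and Python 3's `keys()`, `values()`, and `items()` already return lazy view objects rather than copied lists, so the rename keeps the old memory behaviour:

```python
blob_infos = {0: 10, 1: 20}

# the Python 2 iterator methods no longer exist
assert not hasattr(blob_infos, 'itervalues')

lengths = blob_infos.values()  # a live view, not a list copy
blob_infos[2] = 30
assert 30 in lengths           # the view reflects later insertions

total = sum(length for length in blob_infos.values())
```

The one caveat, handled separately above with `list(...)`, is that a view cannot be iterated while the dict is being resized.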


@@ -1,6 +1,4 @@
 import logging
-from zope.interface import implements
-from lbrynet import interfaces
 from lbrynet.core.BlobInfo import BlobInfo
 from lbrynet.core.client.BlobRequester import BlobRequester
 from lbrynet.core.client.ConnectionManager import ConnectionManager
@@ -14,8 +12,8 @@ from twisted.internet.task import LoopingCall
 log = logging.getLogger(__name__)

-class SingleBlobMetadataHandler(object):
-    implements(interfaces.IMetadataHandler)
+class SingleBlobMetadataHandler:
+    #implements(interfaces.IMetadataHandler)

     def __init__(self, blob_hash, download_manager):
         self.blob_hash = blob_hash
@@ -31,7 +29,7 @@ class SingleBlobMetadataHandler(object):
         return 0

-class SingleProgressManager(object):
+class SingleProgressManager:
     def __init__(self, download_manager, finished_callback, timeout_callback, timeout):
         self.finished_callback = finished_callback
         self.timeout_callback = timeout_callback
@@ -71,10 +69,10 @@ class SingleProgressManager(object):
     def needed_blobs(self):
         blobs = self.download_manager.blobs
         assert len(blobs) == 1
-        return [b for b in blobs.itervalues() if not b.get_is_verified()]
+        return [b for b in blobs.values() if not b.get_is_verified()]

-class DummyBlobHandler(object):
+class DummyBlobHandler:
     def __init__(self):
         pass
@@ -82,7 +80,7 @@ class DummyBlobHandler(object):
         pass

-class StandaloneBlobDownloader(object):
+class StandaloneBlobDownloader:
     def __init__(self, blob_hash, blob_manager, peer_finder,
                  rate_limiter, payment_rate_manager, wallet,
                  timeout=None):


@@ -1,14 +1,12 @@
 import logging
-from lbrynet.interfaces import IProgressManager
 from twisted.internet import defer
-from zope.interface import implements

 log = logging.getLogger(__name__)

-class StreamProgressManager(object):
-    implements(IProgressManager)
+class StreamProgressManager:
+    #implements(IProgressManager)

     def __init__(self, finished_callback, blob_manager,
                  download_manager, delete_blob_after_finished=False):
@@ -82,8 +80,8 @@ class StreamProgressManager(object):
 class FullStreamProgressManager(StreamProgressManager):
     def __init__(self, finished_callback, blob_manager,
                  download_manager, delete_blob_after_finished=False):
-        StreamProgressManager.__init__(self, finished_callback, blob_manager, download_manager,
-                                       delete_blob_after_finished)
+        super().__init__(finished_callback, blob_manager, download_manager,
+                         delete_blob_after_finished)
         self.outputting_d = None

     ######### IProgressManager #########
@@ -103,15 +101,15 @@ class FullStreamProgressManager(StreamProgressManager):
         if not blobs:
             return 0
         else:
-            for i in xrange(max(blobs.iterkeys())):
+            for i in range(max(blobs.keys())):
                 if self._done(i, blobs):
                     return i
-            return max(blobs.iterkeys()) + 1
+            return max(blobs.keys()) + 1

     def needed_blobs(self):
         blobs = self.download_manager.blobs
         return [
-            b for n, b in blobs.iteritems()
+            b for n, b in blobs.items()
             if not b.get_is_verified() and not n in self.provided_blob_nums
         ]


@@ -1,17 +0,0 @@
-import os
-from contextlib import contextmanager
-
-
-@contextmanager
-def get_read_handle(path):
-    """
-    Get os independent read handle for a file
-    """
-    if os.name == "nt":
-        file_mode = 'rb'
-    else:
-        file_mode = 'r'
-    read_handle = open(path, file_mode)
-    yield read_handle
-    read_handle.close()
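The deleted `get_read_handle` helper existed to choose `'rb'` on Windows and `'r'` elsewhere, a distinction that mattered on Python 2 where text mode mangled binary data only on some platforms. On Python 3, `'rb'` behaves identically everywhere, so a plain `open(path, 'rb')` replaces the helper. If the context-manager shape were still wanted, a sketch of the platform-independent equivalent:

```python
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def get_read_handle(path):
    # binary mode is platform-independent on Python 3: no os.name check needed
    read_handle = open(path, 'rb')
    try:
        yield read_handle
    finally:
        read_handle.close()

# quick demonstration with a throwaway file
fd, tmp_path = tempfile.mkstemp()
with os.fdopen(fd, 'wb') as f:
    f.write(b'blob bytes')
with get_read_handle(tmp_path) as handle:
    data = handle.read()
os.unlink(tmp_path)
```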


@@ -14,7 +14,7 @@ from lbrynet.core import utils
 class HTTPSHandler(logging.Handler):
     def __init__(self, url, fqdn=False, localname=None, facility=None, cookies=None):
-        logging.Handler.__init__(self)
+        super().__init__()
         self.url = url
         self.fqdn = fqdn
         self.localname = localname
@@ -243,7 +243,7 @@ def configure_twisted():
     observer.start()

-class LoggerNameFilter(object):
+class LoggerNameFilter:
     """Filter a log record based on its name.

    Allows all info level and higher records to pass thru.


@@ -1,4 +1,4 @@
-class LoopingCallManager(object):
+class LoopingCallManager:
     def __init__(self, calls=None):
         self.calls = calls or {}
@@ -15,6 +15,6 @@ class LoopingCallManager(object):
         self.calls[name].stop()

     def shutdown(self):
-        for lcall in self.calls.itervalues():
+        for lcall in self.calls.values():
             if lcall.running:
                 lcall.stop()


@@ -1,14 +1,12 @@
 import logging
 from twisted.internet import defer
-from zope.interface import implements
-from lbrynet.interfaces import IQueryHandlerFactory, IQueryHandler

 log = logging.getLogger(__name__)

-class BlobAvailabilityHandlerFactory(object):
-    implements(IQueryHandlerFactory)
+class BlobAvailabilityHandlerFactory:
+    # implements(IQueryHandlerFactory)

     def __init__(self, blob_manager):
         self.blob_manager = blob_manager
@@ -26,8 +24,8 @@ class BlobAvailabilityHandlerFactory(object):
         return "Blob Availability - blobs that are available to be uploaded"

-class BlobAvailabilityHandler(object):
-    implements(IQueryHandler)
+class BlobAvailabilityHandler:
+    #implements(IQueryHandler)

     def __init__(self, blob_manager):
         self.blob_manager = blob_manager


@@ -3,17 +3,15 @@ import logging
 from twisted.internet import defer
 from twisted.protocols.basic import FileSender
 from twisted.python.failure import Failure
-from zope.interface import implements
 from lbrynet import analytics
 from lbrynet.core.Offer import Offer
-from lbrynet.interfaces import IQueryHandlerFactory, IQueryHandler, IBlobSender

 log = logging.getLogger(__name__)

-class BlobRequestHandlerFactory(object):
-    implements(IQueryHandlerFactory)
+class BlobRequestHandlerFactory:
+    #implements(IQueryHandlerFactory)

     def __init__(self, blob_manager, wallet, payment_rate_manager, analytics_manager):
         self.blob_manager = blob_manager
@@ -35,8 +33,8 @@ class BlobRequestHandlerFactory(object):
         return "Blob Uploader - uploads blobs"

-class BlobRequestHandler(object):
-    implements(IQueryHandler, IBlobSender)
+class BlobRequestHandler:
+    #implements(IQueryHandler, IBlobSender)

     PAYMENT_RATE_QUERY = 'blob_data_payment_rate'
     BLOB_QUERY = 'requested_blob'
     AVAILABILITY_QUERY = 'requested_blobs'


@@ -1,8 +1,7 @@
 import logging
-from twisted.internet import interfaces, error
+from twisted.internet import error
 from twisted.internet.protocol import Protocol, ServerFactory
 from twisted.python import failure
-from zope.interface import implements
 from lbrynet.core.server.ServerRequestHandler import ServerRequestHandler
@@ -24,7 +23,7 @@ class ServerProtocol(Protocol):
     10) Pause/resume production when told by the rate limiter
     """
-    implements(interfaces.IConsumer)
+    #implements(interfaces.IConsumer)

     #Protocol stuff


@@ -1,25 +1,23 @@
 import json
 import logging
-from twisted.internet import interfaces, defer
-from zope.interface import implements
-from lbrynet.interfaces import IRequestHandler
+from twisted.internet import defer

 log = logging.getLogger(__name__)

-class ServerRequestHandler(object):
+class ServerRequestHandler:
     """This class handles requests from clients. It can upload blobs and
     return request for information about more blobs that are
     associated with streams.
     """
-    implements(interfaces.IPushProducer, interfaces.IConsumer, IRequestHandler)
+    #implements(interfaces.IPushProducer, interfaces.IConsumer, IRequestHandler)

     def __init__(self, consumer):
         self.consumer = consumer
         self.production_paused = False
-        self.request_buff = ''
-        self.response_buff = ''
+        self.request_buff = b''
+        self.response_buff = b''
         self.producer = None
         self.request_received = False
         self.CHUNK_SIZE = 2**14
@@ -56,7 +54,7 @@ class ServerRequestHandler(object):
             return
         chunk = self.response_buff[:self.CHUNK_SIZE]
         self.response_buff = self.response_buff[self.CHUNK_SIZE:]
-        if chunk == '':
+        if chunk == b'':
             return
         log.trace("writing %s bytes to the client", len(chunk))
         self.consumer.write(chunk)
@@ -101,7 +99,7 @@ class ServerRequestHandler(object):
         self.request_buff = self.request_buff + data
         msg = self.try_to_parse_request(self.request_buff)
         if msg:
-            self.request_buff = ''
+            self.request_buff = b''
             self._process_msg(msg)
         else:
             log.debug("Request buff not a valid json message")
@@ -134,7 +132,7 @@ class ServerRequestHandler(object):
         self._produce_more()

     def send_response(self, msg):
-        m = json.dumps(msg)
+        m = json.dumps(msg).encode()
         log.debug("Sending a response of length %s", str(len(m)))
         log.debug("Response: %s", str(m))
         self.response_buff = self.response_buff + m
@@ -167,7 +165,7 @@ class ServerRequestHandler(object):
             return True

         ds = []
-        for query_handler, query_identifiers in self.query_handlers.iteritems():
+        for query_handler, query_identifiers in self.query_handlers.items():
             queries = {q_i: msg[q_i] for q_i in query_identifiers if q_i in msg}
             d = query_handler.handle_queries(queries)
             d.addErrback(log_errors)


@@ -3,9 +3,9 @@ import json
 import subprocess
 import os

-from urllib2 import urlopen, URLError
+from six.moves.urllib import request
+from six.moves.urllib.error import URLError
 from lbryschema import __version__ as lbryschema_version
-from lbryum import __version__ as LBRYUM_VERSION
 from lbrynet import build_type, __version__ as lbrynet_version
 from lbrynet.conf import ROOT_DIR
@@ -18,9 +18,9 @@ def get_lbrynet_version():
         return subprocess.check_output(
             ['git', '--git-dir='+git_dir, 'describe', '--dirty', '--always'],
             stderr=devnull
-        ).strip().lstrip('v')
+        ).decode().strip().lstrip('v')
     except (subprocess.CalledProcessError, OSError):
-        print "failed to get version from git"
+        print("failed to get version from git")
     return lbrynet_version
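Two Python 3 changes meet in that hunk: `subprocess.check_output` returns `bytes`, hence the added `.decode()` before `strip().lstrip('v')`, and `print` is a function, not a statement. A self-contained sketch using a trivial child process in place of `git describe`:

```python
import subprocess
import sys

# stand-in for the `git describe` call: a child process printing a tag
out = subprocess.check_output([sys.executable, '-c', 'print("v0.30.0")'])

assert isinstance(out, bytes)  # bytes on Python 3, not str

# decode first, then apply the same str methods as before
version = out.decode().strip().lstrip('v')
```

Without the `.decode()`, `lstrip('v')` would raise `TypeError: a bytes-like object is required, not 'str'`.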
@@ -32,19 +32,21 @@ def get_platform(get_ip=True):
         "os_release": platform.release(),
         "os_system": platform.system(),
         "lbrynet_version": get_lbrynet_version(),
-        "lbryum_version": LBRYUM_VERSION,
         "lbryschema_version": lbryschema_version,
         "build": build_type.BUILD,  # CI server sets this during build step
     }
     if p["os_system"] == "Linux":
-        import distro
-        p["distro"] = distro.info()
-        p["desktop"] = os.environ.get('XDG_CURRENT_DESKTOP', 'Unknown')
+        try:
+            import distro
+            p["distro"] = distro.info()
+            p["desktop"] = os.environ.get('XDG_CURRENT_DESKTOP', 'Unknown')
+        except ModuleNotFoundError:
+            pass

     # TODO: remove this from get_platform and add a get_external_ip function using treq
     if get_ip:
         try:
-            response = json.loads(urlopen("https://api.lbry.io/ip").read())
+            response = json.loads(request.urlopen("https://api.lbry.io/ip").read())
             if not response['success']:
                 raise URLError("failed to get external ip")
             p['ip'] = response['data']['ip']
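`urllib2` does not exist on Python 3; `six.moves.urllib` re-exports `urllib.request` and `urllib.error` there, so the same import line works under both interpreters. The response handling can be exercised without the network (the payload below is a made-up sample of the api.lbry.io/ip shape):

```python
import json
from urllib.error import URLError  # where six.moves.urllib.error points on Python 3

def parse_ip_response(raw):
    # mirrors the success check and field access in the hunk above
    response = json.loads(raw)
    if not response['success']:
        raise URLError("failed to get external ip")
    return response['data']['ip']

ip = parse_ip_response(b'{"success": true, "data": {"ip": "203.0.113.7"}}')
```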


@@ -1,4 +1,5 @@
 import base64
+import codecs
 import datetime
 import random
 import socket
@@ -62,9 +63,9 @@ def safe_stop_looping_call(looping_call):
 def generate_id(num=None):
     h = get_lbry_hash_obj()
     if num is not None:
-        h.update(str(num))
+        h.update(str(num).encode())
     else:
-        h.update(str(random.getrandbits(512)))
+        h.update(str(random.getrandbits(512)).encode())
     return h.digest()
@@ -88,15 +89,19 @@ def version_is_greater_than(a, b):
     return pkg_resources.parse_version(a) > pkg_resources.parse_version(b)

+def rot13(some_str):
+    return codecs.encode(some_str, 'rot_13')
+
 def deobfuscate(obfustacated):
-    return base64.b64decode(obfustacated.decode('rot13'))
+    return base64.b64decode(rot13(obfustacated))

 def obfuscate(plain):
-    return base64.b64encode(plain).encode('rot13')
+    return rot13(base64.b64encode(plain).decode())
def check_connection(server="lbry.io", port=80, timeout=2): def check_connection(server="lbry.io", port=80, timeout=5):
"""Attempts to open a socket to server:port and returns True if successful.""" """Attempts to open a socket to server:port and returns True if successful."""
log.debug('Checking connection to %s:%s', server, port) log.debug('Checking connection to %s:%s', server, port)
try: try:
@ -142,7 +147,7 @@ def get_sd_hash(stream_info):
get('source', {}).\ get('source', {}).\
get('source') get('source')
if not result: if not result:
log.warn("Unable to get sd_hash") log.warning("Unable to get sd_hash")
return result return result
@ -150,7 +155,7 @@ def json_dumps_pretty(obj, **kwargs):
return json.dumps(obj, sort_keys=True, indent=2, separators=(',', ': '), **kwargs) return json.dumps(obj, sort_keys=True, indent=2, separators=(',', ': '), **kwargs)
class DeferredLockContextManager(object): class DeferredLockContextManager:
def __init__(self, lock): def __init__(self, lock):
self._lock = lock self._lock = lock
@ -166,7 +171,7 @@ def DeferredDict(d, consumeErrors=False):
keys = [] keys = []
dl = [] dl = []
response = {} response = {}
for k, v in d.iteritems(): for k, v in d.items():
keys.append(k) keys.append(k)
dl.append(v) dl.append(v)
results = yield defer.DeferredList(dl, consumeErrors=consumeErrors) results = yield defer.DeferredList(dl, consumeErrors=consumeErrors)
@ -176,7 +181,7 @@ def DeferredDict(d, consumeErrors=False):
defer.returnValue(response) defer.returnValue(response)
class DeferredProfiler(object): class DeferredProfiler:
def __init__(self): def __init__(self):
self.profile_results = {} self.profile_results = {}
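The new `rot13` helper exists because Python 3 dropped the `str.encode('rot13')`/`str.decode('rot13')` shortcuts; `codecs.encode(..., 'rot_13')` only accepts `str`, so the base64 bytes must be decoded before rotation. A round-trip sketch mirroring the pair above (simplified, correct spelling used for the parameter name):

```python
# obfuscate: bytes -> base64 bytes -> str -> rot13 str
# deobfuscate: rot13 str -> rot13 str (involution undoes it) -> b64decode -> bytes
import base64
import codecs

def rot13(some_str):
    return codecs.encode(some_str, 'rot_13')

def obfuscate(plain: bytes) -> str:
    return rot13(base64.b64encode(plain).decode())

def deobfuscate(obfuscated: str) -> bytes:
    return base64.b64decode(rot13(obfuscated))

token = obfuscate(b"lbry stream key")
assert deobfuscate(token) == b"lbry stream key"
```

The round trip works because rot13 is its own inverse and maps the base64 alphabet's letters onto letters, leaving digits, `+`, `/`, and `=` untouched.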

View file

@@ -16,21 +16,21 @@ backend = default_backend()
 class CryptBlobInfo(BlobInfo):
     def __init__(self, blob_hash, blob_num, length, iv):
-        BlobInfo.__init__(self, blob_hash, blob_num, length)
+        super().__init__(blob_hash, blob_num, length)
         self.iv = iv

     def get_dict(self):
         info = {
             "blob_num": self.blob_num,
             "length": self.length,
-            "iv": self.iv
+            "iv": self.iv.decode()
         }
         if self.blob_hash:
             info['blob_hash'] = self.blob_hash
         return info


-class StreamBlobDecryptor(object):
+class StreamBlobDecryptor:
     def __init__(self, blob, key, iv, length):
         """
         This class decrypts blob
@@ -68,14 +68,14 @@ class StreamBlobDecryptor(object):
         def write_bytes():
             if self.len_read < self.length:
-                num_bytes_to_decrypt = greatest_multiple(len(self.buff), (AES.block_size / 8))
+                num_bytes_to_decrypt = greatest_multiple(len(self.buff), (AES.block_size // 8))
                 data_to_decrypt, self.buff = split(self.buff, num_bytes_to_decrypt)
                 write_func(self.cipher.update(data_to_decrypt))

         def finish_decrypt():
-            bytes_left = len(self.buff) % (AES.block_size / 8)
+            bytes_left = len(self.buff) % (AES.block_size // 8)
             if bytes_left != 0:
-                log.warning(self.buff[-1 * (AES.block_size / 8):].encode('hex'))
+                log.warning(self.buff[-1 * (AES.block_size // 8):].encode('hex'))
                 raise Exception("blob %s has incorrect padding: %i bytes left" %
                                 (self.blob.blob_hash, bytes_left))
             data_to_decrypt, self.buff = self.buff, b''
@@ -99,7 +99,7 @@ class StreamBlobDecryptor(object):
         return d


-class CryptStreamBlobMaker(object):
+class CryptStreamBlobMaker:
     def __init__(self, key, iv, blob_num, blob):
         """
         This class encrypts data and writes it to a new blob
@@ -146,7 +146,7 @@ class CryptStreamBlobMaker(object):
     def close(self):
         log.debug("closing blob %s with plaintext len %s", str(self.blob_num), str(self.length))
         if self.length != 0:
-            self.length += (AES.block_size / 8) - (self.length % (AES.block_size / 8))
+            self.length += (AES.block_size // 8) - (self.length % (AES.block_size // 8))
             padded_data = self.padder.finalize()
             encrypted_data = self.cipher.update(padded_data) + self.cipher.finalize()
         self.blob.write(encrypted_data)
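Every `AES.block_size / 8` above becomes `// 8` because Python 3 made `/` true division: it returns a float even for evenly divisible ints, which breaks byte-count arithmetic. A self-contained illustration (128 is the AES block size in bits, as `cryptography`'s `AES.block_size` reports it):

```python
# In Python 3, "/" yields a float; "//" keeps an int, which byte-count
# consumers like os.urandom require.
import os

block_size = 128  # AES block size in bits

assert block_size / 8 == 16.0           # true division -> float
assert isinstance(block_size / 8, float)
assert block_size // 8 == 16            # floor division -> int

iv = os.urandom(block_size // 8)        # os.urandom needs an int
assert len(iv) == 16
# os.urandom(block_size / 8) would raise TypeError under Python 3
```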

View file

@@ -5,15 +5,14 @@ import os
 import logging
 from cryptography.hazmat.primitives.ciphers.algorithms import AES
-from twisted.internet import interfaces, defer
-from zope.interface import implements
+from twisted.internet import defer
 from lbrynet.cryptstream.CryptBlob import CryptStreamBlobMaker

 log = logging.getLogger(__name__)


-class CryptStreamCreator(object):
+class CryptStreamCreator:
     """
     Create a new stream with blobs encrypted by a symmetric cipher.
@@ -22,7 +21,7 @@ class CryptStreamCreator(object):
     the blob is associated with the stream.
     """
-    implements(interfaces.IConsumer)
+    #implements(interfaces.IConsumer)

     def __init__(self, blob_manager, name=None, key=None, iv_generator=None):
         """@param blob_manager: Object that stores and provides access to blobs.
@@ -101,13 +100,13 @@ class CryptStreamCreator(object):
     @staticmethod
     def random_iv_generator():
         while 1:
-            yield os.urandom(AES.block_size / 8)
+            yield os.urandom(AES.block_size // 8)

     def setup(self):
         """Create the symmetric key if it wasn't provided"""
         if self.key is None:
-            self.key = os.urandom(AES.block_size / 8)
+            self.key = os.urandom(AES.block_size // 8)
         return defer.succeed(True)
@@ -122,7 +121,7 @@ class CryptStreamCreator(object):
         yield defer.DeferredList(self.finished_deferreds)
         self.blob_count += 1
-        iv = self.iv_generator.next()
+        iv = next(self.iv_generator)
         final_blob = self._get_blob_maker(iv, self.blob_manager.get_blob_creator())
         stream_terminator = yield final_blob.close()
         terminator_info = yield self._blob_finished(stream_terminator)
@@ -133,7 +132,7 @@ class CryptStreamCreator(object):
         if self.current_blob is None:
             self.next_blob_creator = self.blob_manager.get_blob_creator()
             self.blob_count += 1
-            iv = self.iv_generator.next()
+            iv = next(self.iv_generator)
             self.current_blob = self._get_blob_maker(iv, self.next_blob_creator)
         done, num_bytes_written = self.current_blob.write(data)
         data = data[num_bytes_written:]
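The `iv_generator.next()` → `next(iv_generator)` change reflects Python 3 renaming the generator method to `__next__`; only the `next()` builtin works in both the old and new call sites. A runnable sketch of the IV generator pattern above:

```python
# Python 3 generator protocol: gen.next() is gone, use the next() builtin,
# which calls gen.__next__() under the hood.
import os

def random_iv_generator(block_size=128):
    while True:
        yield os.urandom(block_size // 8)

gen = random_iv_generator()
iv = next(gen)                      # Python 3 spelling
assert len(iv) == 16
assert not hasattr(gen, 'next')     # the Python 2 method no longer exists
```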

View file

@@ -1,12 +1,10 @@
 import binascii
-from zope.interface import implements
 from twisted.internet import defer
 from lbrynet.cryptstream.CryptBlob import StreamBlobDecryptor
-from lbrynet.interfaces import IBlobHandler


-class CryptBlobHandler(object):
-    implements(IBlobHandler)
+class CryptBlobHandler:
+    #implements(IBlobHandler)

     def __init__(self, key, write_func):
         self.key = key

View file

@@ -1,7 +1,5 @@
-import binascii
+from binascii import unhexlify
 import logging
-from zope.interface import implements
-from lbrynet.interfaces import IStreamDownloader
 from lbrynet.core.client.BlobRequester import BlobRequester
 from lbrynet.core.client.ConnectionManager import ConnectionManager
 from lbrynet.core.client.DownloadManager import DownloadManager
@@ -34,9 +32,9 @@ class CurrentlyStartingError(Exception):
     pass


-class CryptStreamDownloader(object):
+class CryptStreamDownloader:

-    implements(IStreamDownloader)
+    #implements(IStreamDownloader)

     def __init__(self, peer_finder, rate_limiter, blob_manager, payment_rate_manager, wallet,
                  key, stream_name):
@@ -62,8 +60,8 @@ class CryptStreamDownloader(object):
         self.blob_manager = blob_manager
         self.payment_rate_manager = payment_rate_manager
         self.wallet = wallet
-        self.key = binascii.unhexlify(key)
-        self.stream_name = binascii.unhexlify(stream_name)
+        self.key = unhexlify(key)
+        self.stream_name = unhexlify(stream_name).decode()
         self.completed = False
         self.stopped = True
         self.stopping = False
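Note the asymmetry introduced above: `unhexlify` returns `bytes` in Python 3, which is right for a cipher key, but a display field like `stream_name` needs an extra `.decode()` to become `str`. A sketch with illustrative values:

```python
# unhexlify: hex -> bytes. Keys stay bytes; human-readable names get decoded.
from binascii import hexlify, unhexlify

key_hex = hexlify(b"\x01\x02\x03\x04")        # b'01020304'
name_hex = hexlify("movie.mp4".encode())      # hex of a UTF-8 string

key = unhexlify(key_hex)                      # bytes, suitable for a cipher
stream_name = unhexlify(name_hex).decode()    # str, suitable for display

assert key == b"\x01\x02\x03\x04"
assert stream_name == "movie.mp4"
```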

View file

@@ -1,7 +1,7 @@
 import logging
 from twisted.internet import defer
 from twisted._threads import AlreadyQuit
-from ComponentManager import ComponentManager
+from .ComponentManager import ComponentManager

 log = logging.getLogger(__name__)
@@ -14,7 +14,7 @@ class ComponentType(type):
         return klass


-class Component(object):
+class Component(metaclass=ComponentType):
     """
     lbrynet-daemon component helper
@@ -22,7 +22,6 @@ class Component(object):
     methods
     """

-    __metaclass__ = ComponentType
     depends_on = []
     component_name = None
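This change matters for correctness, not just style: Python 3 silently ignores a `__metaclass__` class attribute, so the metaclass must be passed in the class header. A minimal registering metaclass in the spirit of `ComponentType` (simplified from the diff; the registry dict is illustrative):

```python
# Python 3 metaclass syntax: class C(metaclass=M). The Python 2
# "__metaclass__ = M" attribute would be a no-op here.
components = {}

class ComponentType(type):
    def __new__(mcs, name, bases, newattrs):
        klass = super().__new__(mcs, name, bases, newattrs)
        # register concrete subclasses that declare a component_name
        if name != "Component" and newattrs.get('component_name'):
            components[klass.component_name] = klass
        return klass

class Component(metaclass=ComponentType):
    component_name = None

class DatabaseComponent(Component):
    component_name = "database"

assert components == {"database": DatabaseComponent}
```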

View file

@@ -6,7 +6,7 @@ from lbrynet.core.Error import ComponentStartConditionNotMet
 log = logging.getLogger(__name__)


-class RegisteredConditions(object):
+class RegisteredConditions:
     conditions = {}
@@ -20,7 +20,7 @@ class RequiredConditionType(type):
         return klass


-class RequiredCondition(object):
+class RequiredCondition(metaclass=RequiredConditionType):
     name = ""
     component = ""
     message = ""
@@ -29,10 +29,8 @@ class RequiredCondition(object):
     def evaluate(component):
         raise NotImplementedError()

-    __metaclass__ = RequiredConditionType
-
-
-class ComponentManager(object):
+
+class ComponentManager:
     default_component_classes = {}

     def __init__(self, reactor=None, analytics_manager=None, skip_components=None, **override_components):
@@ -43,7 +41,7 @@ class ComponentManager(object):
         self.components = set()
         self.analytics_manager = analytics_manager

-        for component_name, component_class in self.default_component_classes.iteritems():
+        for component_name, component_class in self.default_component_classes.items():
             if component_name in override_components:
                 component_class = override_components.pop(component_name)
             if component_name not in self.skip_components:
@@ -52,7 +50,7 @@ class ComponentManager(object):
         if override_components:
             raise SyntaxError("unexpected components: %s" % override_components)

-        for component_class in self.component_classes.itervalues():
+        for component_class in self.component_classes.values():
             self.components.add(component_class(self))

     @defer.inlineCallbacks
@@ -117,7 +115,7 @@ class ComponentManager(object):
         :return: (defer.Deferred)
         """

-        for component_name, cb in callbacks.iteritems():
+        for component_name, cb in callbacks.items():
             if component_name not in self.component_classes:
                 raise NameError("unknown component: %s" % component_name)
             if not callable(cb):
@@ -132,7 +130,7 @@ class ComponentManager(object):
         stages = self.sort_components()
         for stage in stages:
-            yield defer.DeferredList([_setup(component) for component in stage])
+            yield defer.DeferredList([_setup(component) for component in stage if not component.running])

     @defer.inlineCallbacks
     def stop(self):
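The `iteritems()`/`itervalues()` replacements are mechanical: Python 3 removed them, and `.items()`/`.values()` now return lightweight views. A pure-Python sketch of the constructor's override/skip logic using them (component names and classes here are stand-in strings, not the real classes):

```python
# Mirrors ComponentManager.__init__: overrides replace defaults and are
# consumed with pop(); skipped components are dropped entirely.
default_component_classes = {'wallet': 'WalletComponent', 'dht': 'DHTComponent'}
override_components = {'wallet': 'FakeWallet'}
skip_components = ['dht']

component_classes = {}
for component_name, component_class in default_component_classes.items():
    if component_name in override_components:
        component_class = override_components.pop(component_name)
    if component_name not in skip_components:
        component_classes[component_name] = component_class

assert component_classes == {'wallet': 'FakeWallet'}
assert not override_components   # every override was consumed
```

Popping consumed overrides is what lets the real code raise on leftovers: any key still in `override_components` afterwards named a component that does not exist.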

View file

@@ -1,20 +1,21 @@
 import os
 import logging
-from hashlib import sha256
 import treq
 import math
 import binascii
+from hashlib import sha256
+from types import SimpleNamespace
 from twisted.internet import defer, threads, reactor, error
+import lbryschema
 from txupnp.upnp import UPnP
-from lbryum.simple_config import SimpleConfig
-from lbryum.constants import HEADERS_URL, HEADER_SIZE
 from lbrynet import conf
 from lbrynet.core.utils import DeferredDict
 from lbrynet.core.PaymentRateManager import OnlyFreePaymentsManager
 from lbrynet.core.RateLimiter import RateLimiter
 from lbrynet.core.BlobManager import DiskBlobManager
 from lbrynet.core.StreamDescriptor import StreamDescriptorIdentifier, EncryptedFileStreamType
-from lbrynet.core.Wallet import LBRYumWallet
+from lbrynet.wallet.manager import LbryWalletManager
+from lbrynet.wallet.network import Network
 from lbrynet.core.server.BlobRequestHandler import BlobRequestHandlerFactory
 from lbrynet.core.server.ServerProtocol import ServerProtocolFactory
 from lbrynet.daemon.Component import Component
@@ -25,7 +26,7 @@ from lbrynet.file_manager.EncryptedFileManager import EncryptedFileManager
 from lbrynet.lbry_file.client.EncryptedFileDownloader import EncryptedFileSaverFactory
 from lbrynet.lbry_file.client.EncryptedFileOptions import add_lbry_file_to_sd_identifier
 from lbrynet.reflector import ServerFactory as reflector_server_factory
-from lbrynet.txlbryum.factory import StratumClient
 from lbrynet.core.utils import generate_id

 log = logging.getLogger(__name__)
@@ -68,7 +69,7 @@ def get_wallet_config():
     return config


-class ConfigSettings(object):
+class ConfigSettings:
     @staticmethod
     def get_conf_setting(setting_name):
         return conf.settings[setting_name]
@@ -101,7 +102,7 @@ class DatabaseComponent(Component):
     component_name = DATABASE_COMPONENT

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.storage = None

     @property
@@ -169,12 +170,18 @@ class DatabaseComponent(Component):
         self.storage = None


+HEADERS_URL = "https://headers.lbry.io/blockchain_headers_latest"
+HEADER_SIZE = 112
+
+
 class HeadersComponent(Component):
     component_name = HEADERS_COMPONENT

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
-        self.config = SimpleConfig(get_wallet_config())
+        super().__init__(component_manager)
+        self.headers_dir = os.path.join(conf.settings['lbryum_wallet_dir'], 'lbc_mainnet')
+        self.headers_file = os.path.join(self.headers_dir, 'headers')
+        self.old_file = os.path.join(conf.settings['lbryum_wallet_dir'], 'blockchain_headers')
         self._downloading_headers = None
         self._headers_progress_percent = None
@@ -190,19 +197,18 @@ class HeadersComponent(Component):
     @defer.inlineCallbacks
     def fetch_headers_from_s3(self):
-        local_header_size = self.local_header_file_size()
-        self._headers_progress_percent = 0.0
-        resume_header = {"Range": "bytes={}-".format(local_header_size)}
-        response = yield treq.get(HEADERS_URL, headers=resume_header)
-        final_size_after_download = response.length + local_header_size
-
-        def collector(data, h_file, start_size):
+        def collector(data, h_file):
             h_file.write(data)
             local_size = float(h_file.tell())
             final_size = float(final_size_after_download)
-            self._headers_progress_percent = math.ceil((local_size - start_size) / (final_size - start_size) * 100)
+            self._headers_progress_percent = math.ceil(local_size / final_size * 100)

-        if response.code == 406:  # our file is bigger
+        local_header_size = self.local_header_file_size()
+        resume_header = {"Range": "bytes={}-".format(local_header_size)}
+        response = yield treq.get(HEADERS_URL, headers=resume_header)
+        got_406 = response.code == 406  # our file is bigger
+        final_size_after_download = response.length + local_header_size
+        if got_406:
             log.warning("s3 is more out of date than we are")
         # should have something to download and a final length divisible by the header size
         elif final_size_after_download and not final_size_after_download % HEADER_SIZE:
@@ -211,11 +217,11 @@ class HeadersComponent(Component):
             if s3_height > local_height:
                 if local_header_size:
                     log.info("Resuming download of %i bytes from s3", response.length)
-                    with open(os.path.join(self.config.path, "blockchain_headers"), "a+b") as headers_file:
-                        yield treq.collect(response, lambda d: collector(d, headers_file, local_header_size))
+                    with open(self.headers_file, "a+b") as headers_file:
+                        yield treq.collect(response, lambda d: collector(d, headers_file))
                 else:
-                    with open(os.path.join(self.config.path, "blockchain_headers"), "wb") as headers_file:
-                        yield treq.collect(response, lambda d: collector(d, headers_file, 0))
+                    with open(self.headers_file, "wb") as headers_file:
+                        yield treq.collect(response, lambda d: collector(d, headers_file))
                 log.info("fetched headers from s3 (s3 height: %i), now verifying integrity after download.", s3_height)
                 self._check_header_file_integrity()
             else:
@@ -227,20 +233,22 @@ class HeadersComponent(Component):
         return max((self.local_header_file_size() / HEADER_SIZE) - 1, 0)

     def local_header_file_size(self):
-        headers_path = os.path.join(self.config.path, "blockchain_headers")
-        if os.path.isfile(headers_path):
-            return os.stat(headers_path).st_size
+        if os.path.isfile(self.headers_file):
+            return os.stat(self.headers_file).st_size
         return 0

     @defer.inlineCallbacks
-    def get_remote_height(self, server, port):
-        connected = defer.Deferred()
-        connected.addTimeout(3, reactor, lambda *_: None)
-        client = StratumClient(connected)
-        reactor.connectTCP(server, port, client)
-        yield connected
-        remote_height = yield client.blockchain_block_get_server_height()
-        client.client.transport.loseConnection()
+    def get_remote_height(self):
+        ledger = SimpleNamespace()
+        ledger.config = {
+            'default_servers': conf.settings['lbryum_servers'],
+            'data_path': conf.settings['lbryum_wallet_dir']
+        }
+        net = Network(ledger)
+        net.start()
+        yield net.on_connected.first
+        remote_height = yield net.get_server_height()
+        yield net.stop()
         defer.returnValue(remote_height)

     @defer.inlineCallbacks
@@ -252,15 +260,10 @@ class HeadersComponent(Component):
         if not s3_headers_depth:
             defer.returnValue(False)
         local_height = self.local_header_file_height()
-        for server_url in self.config.get('default_servers'):
-            port = int(self.config.get('default_servers')[server_url]['t'])
-            try:
-                remote_height = yield self.get_remote_height(server_url, port)
-                log.info("%s:%i height: %i, local height: %s", server_url, port, remote_height, local_height)
-                if remote_height > (local_height + s3_headers_depth):
-                    defer.returnValue(True)
-            except Exception as err:
-                log.warning("error requesting remote height from %s:%i - %s", server_url, port, err)
+        remote_height = yield self.get_remote_height()
+        log.info("remote height: %i, local height: %s", remote_height, local_height)
+        if remote_height > (local_height + s3_headers_depth):
+            defer.returnValue(True)
         defer.returnValue(False)

     def _check_header_file_integrity(self):
@@ -272,22 +275,26 @@ class HeadersComponent(Component):
         checksum_length_in_bytes = checksum_height * HEADER_SIZE
         if self.local_header_file_size() < checksum_length_in_bytes:
             return
-        headers_path = os.path.join(self.config.path, "blockchain_headers")
-        with open(headers_path, "rb") as headers_file:
+        with open(self.headers_file, "rb") as headers_file:
             hashsum.update(headers_file.read(checksum_length_in_bytes))
         current_checksum = hashsum.hexdigest()
         if current_checksum != checksum:
             msg = "Expected checksum {}, got {}".format(checksum, current_checksum)
             log.warning("Wallet file corrupted, checksum mismatch. " + msg)
             log.warning("Deleting header file so it can be downloaded again.")
-            os.unlink(headers_path)
+            os.unlink(self.headers_file)
         elif (self.local_header_file_size() % HEADER_SIZE) != 0:
             log.warning("Header file is good up to checkpoint height, but incomplete. Truncating to checkpoint.")
-            with open(headers_path, "rb+") as headers_file:
+            with open(self.headers_file, "rb+") as headers_file:
                 headers_file.truncate(checksum_length_in_bytes)

     @defer.inlineCallbacks
     def start(self):
+        if not os.path.exists(self.headers_dir):
+            os.mkdir(self.headers_dir)
+        if os.path.exists(self.old_file):
+            log.warning("Moving old headers from %s to %s.", self.old_file, self.headers_file)
+            os.rename(self.old_file, self.headers_file)
         self._downloading_headers = yield self.should_download_headers_from_s3()
         if self._downloading_headers:
             try:
@@ -306,7 +313,7 @@ class WalletComponent(Component):
     depends_on = [DATABASE_COMPONENT, HEADERS_COMPONENT]

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.wallet = None

     @property
@@ -329,9 +336,11 @@ class WalletComponent(Component):
     @defer.inlineCallbacks
     def start(self):
+        log.info("Starting torba wallet")
         storage = self.component_manager.get_component(DATABASE_COMPONENT)
-        config = get_wallet_config()
-        self.wallet = LBRYumWallet(storage, config)
+        lbryschema.BLOCKCHAIN_NAME = conf.settings['blockchain_name']
+        self.wallet = LbryWalletManager.from_lbrynet_config(conf.settings, storage)
+        self.wallet.old_db = storage
         yield self.wallet.start()

     @defer.inlineCallbacks
@@ -345,7 +354,7 @@ class BlobComponent(Component):
     depends_on = [DATABASE_COMPONENT, DHT_COMPONENT]

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.blob_manager = None

     @property
@@ -376,7 +385,7 @@ class DHTComponent(Component):
     depends_on = [UPNP_COMPONENT]

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.dht_node = None
         self.upnp_component = None
         self.external_udp_port = None
@@ -426,7 +435,7 @@ class HashAnnouncerComponent(Component):
     depends_on = [DHT_COMPONENT, DATABASE_COMPONENT]

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.hash_announcer = None

     @property
@@ -454,7 +463,7 @@ class RateLimiterComponent(Component):
     component_name = RATE_LIMITER_COMPONENT

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.rate_limiter = RateLimiter()

     @property
@@ -475,7 +484,7 @@ class StreamIdentifierComponent(Component):
     depends_on = [DHT_COMPONENT, RATE_LIMITER_COMPONENT, BLOB_COMPONENT, DATABASE_COMPONENT, WALLET_COMPONENT]

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.sd_identifier = StreamDescriptorIdentifier()

     @property
@@ -509,7 +518,7 @@ class PaymentRateComponent(Component):
     component_name = PAYMENT_RATE_COMPONENT

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.payment_rate_manager = OnlyFreePaymentsManager()

     @property
@@ -529,7 +538,7 @@ class FileManagerComponent(Component):
                   STREAM_IDENTIFIER_COMPONENT, PAYMENT_RATE_COMPONENT]

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.file_manager = None

     @property
@@ -569,7 +578,7 @@ class PeerProtocolServerComponent(Component):
                   PAYMENT_RATE_COMPONENT]

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.lbry_server_port = None

     @property
@@ -621,7 +630,7 @@ class ReflectorComponent(Component):
     depends_on = [DHT_COMPONENT, BLOB_COMPONENT, FILE_MANAGER_COMPONENT]

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self.reflector_server_port = GCS('reflector_port')
         self.reflector_server = None
@@ -655,7 +664,7 @@ class UPnPComponent(Component):
     component_name = UPNP_COMPONENT

     def __init__(self, component_manager):
-        Component.__init__(self, component_manager)
+        super().__init__(component_manager)
         self._int_peer_port = GCS('peer_port')
         self._int_dht_node_port = GCS('dht_node_port')
         self.use_upnp = GCS('use_upnp')
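The reworked `fetch_headers_from_s3` resumes a partial download with an HTTP `Range` header, treats a 406 response as "local file is already past what the server has", and sanity-checks that the final size divides evenly by the fixed header record size. Those three pieces of bookkeeping, factored into pure functions (a sketch; `HEADER_SIZE` matches the constant the diff adds, the function names are illustrative):

```python
# Pure-function sketch of the S3 header download bookkeeping.
import math

HEADER_SIZE = 112  # bytes per blockchain header record, as in the diff

def resume_range_header(local_size):
    # Ask the server to send only the bytes we don't have yet.
    return {"Range": "bytes={}-".format(local_size)}

def progress_percent(local_size, final_size):
    # Matches the simplified collector: percent of the *whole* file on disk.
    return math.ceil(local_size / final_size * 100)

def download_looks_sane(final_size_after_download):
    # Non-empty, and an exact multiple of the header record size.
    return bool(final_size_after_download) and final_size_after_download % HEADER_SIZE == 0

assert resume_range_header(224) == {"Range": "bytes=224-"}
assert progress_percent(224, 1120) == 20
assert download_looks_sane(1120) and not download_looks_sane(1100)
```

Reporting progress against the whole file (rather than against the resumed span, as the removed `start_size` arithmetic did) is what lets the new `collector` drop a parameter.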

View file

@@ -1,5 +1,3 @@
-# coding=utf-8
-import binascii
 import logging.handlers
 import mimetypes
 import os
@@ -7,12 +5,18 @@ import requests
 import urllib
 import json
 import textwrap
+from operator import itemgetter
+from binascii import hexlify, unhexlify
 from copy import deepcopy
 from decimal import Decimal, InvalidOperation
 from twisted.web import server
 from twisted.internet import defer, reactor
 from twisted.internet.task import LoopingCall
 from twisted.python.failure import Failure
+from typing import Union
+from torba.constants import COIN
 from lbryschema.claim import ClaimDict
 from lbryschema.uri import parse_lbry_uri
@@ -41,6 +45,8 @@ from lbrynet.dht.error import TimeoutError
 from lbrynet.core.Peer import Peer
 from lbrynet.core.SinglePeerDownloader import SinglePeerDownloader
 from lbrynet.core.client.StandaloneBlobDownloader import StandaloneBlobDownloader
+from lbrynet.wallet.account import Account as LBCAccount
+from torba.baseaccount import SingleKey, HierarchicalDeterministic

 log = logging.getLogger(__name__)
 requires = AuthJSONRPCServer.requires
@@ -75,7 +81,7 @@ DIRECTION_DESCENDING = 'desc'
 DIRECTIONS = DIRECTION_ASCENDING, DIRECTION_DESCENDING

-class IterableContainer(object):
+class IterableContainer:
     def __iter__(self):
         for attr in dir(self):
             if not attr.startswith("_"):
@@ -88,7 +94,7 @@ class IterableContainer(object):
         return False

-class Checker(object):
+class Checker:
     """The looping calls the daemon runs"""
     INTERNET_CONNECTION = 'internet_connection_checker', 300
     # CONNECTION_STATUS = 'connection_status_checker'
@@ -120,7 +126,7 @@ class NoValidSearch(Exception):
     pass

-class CheckInternetConnection(object):
+class CheckInternetConnection:
     def __init__(self, daemon):
         self.daemon = daemon
@@ -128,7 +134,7 @@ class CheckInternetConnection(object):
         self.daemon.connected_to_internet = utils.check_connection()

-class AlwaysSend(object):
+class AlwaysSend:
     def __init__(self, value_generator, *args, **kwargs):
         self.value_generator = value_generator
         self.args = args
@@ -176,7 +182,9 @@ class WalletIsLocked(RequiredCondition):
     @staticmethod
     def evaluate(component):
-        return component.check_locked()
+        d = component.check_locked()
+        d.addCallback(lambda r: not r)
+        return d

 class Daemon(AuthJSONRPCServer):
@@ -230,6 +238,13 @@ class Daemon(AuthJSONRPCServer):
         # TODO: delete this
         self.streams = {}

+    @property
+    def ledger(self):
+        try:
+            return self.wallet.default_account.ledger
+        except AttributeError:
+            return None
+
     @defer.inlineCallbacks
     def setup(self):
         log.info("Starting lbrynet-daemon")
@@ -239,7 +254,7 @@ class Daemon(AuthJSONRPCServer):
     def _stop_streams(self):
         """stop pending GetStream downloads"""
-        for sd_hash, stream in self.streams.iteritems():
+        for sd_hash, stream in self.streams.items():
             stream.cancel(reason="daemon shutdown")

     def _shutdown(self):
@@ -269,7 +284,7 @@ class Daemon(AuthJSONRPCServer):
     @defer.inlineCallbacks
     def _get_stream_analytics_report(self, claim_dict):
-        sd_hash = claim_dict.source_hash
+        sd_hash = claim_dict.source_hash.decode()
         try:
             stream_hash = yield self.storage.get_stream_hash_for_sd_hash(sd_hash)
         except Exception:
@@ -348,49 +363,39 @@ class Daemon(AuthJSONRPCServer):
             log.error('Failed to get %s (%s)', name, err)
             if self.streams[sd_hash].downloader and self.streams[sd_hash].code != 'running':
                 yield self.streams[sd_hash].downloader.stop(err)
-            result = {'error': err.message}
+            result = {'error': str(err)}
         finally:
             del self.streams[sd_hash]
         defer.returnValue(result)

     @defer.inlineCallbacks
-    def _publish_stream(self, name, bid, claim_dict, file_path=None, certificate_id=None,
+    def _publish_stream(self, name, bid, claim_dict, file_path=None, certificate=None,
                         claim_address=None, change_address=None):
         publisher = Publisher(
-            self.blob_manager, self.payment_rate_manager, self.storage, self.file_manager, self.wallet, certificate_id
+            self.blob_manager, self.payment_rate_manager, self.storage, self.file_manager, self.wallet, certificate
         )
         parse_lbry_uri(name)
         if not file_path:
             stream_hash = yield self.storage.get_stream_hash_for_sd_hash(
                 claim_dict['stream']['source']['source'])
-            claim_out = yield publisher.publish_stream(name, bid, claim_dict, stream_hash, claim_address,
-                                                       change_address)
+            tx = yield publisher.publish_stream(name, bid, claim_dict, stream_hash, claim_address)
         else:
-            claim_out = yield publisher.create_and_publish_stream(name, bid, claim_dict, file_path,
-                                                                  claim_address, change_address)
+            tx = yield publisher.create_and_publish_stream(name, bid, claim_dict, file_path, claim_address)
         if conf.settings['reflect_uploads']:
             d = reupload.reflect_file(publisher.lbry_file)
             d.addCallbacks(lambda _: log.info("Reflected new publication to lbry://%s", name),
                            log.exception)
         self.analytics_manager.send_claim_action('publish')
-        log.info("Success! Published to lbry://%s txid: %s nout: %d", name, claim_out['txid'],
-                 claim_out['nout'])
-        defer.returnValue(claim_out)
-
-    @defer.inlineCallbacks
-    def _resolve_name(self, name, force_refresh=False):
-        """Resolves a name. Checks the cache first before going out to the blockchain.
-
-        Args:
-            name: the lbry://<name> to resolve
-            force_refresh: if True, always go out to the blockchain to resolve.
-        """
-        parsed = parse_lbry_uri(name)
-        resolution = yield self.wallet.resolve(parsed.name, check_cache=not force_refresh)
-        if parsed.name in resolution:
-            result = resolution[parsed.name]
-            defer.returnValue(result)
+        nout = 0
+        txo = tx.outputs[nout]
+        log.info("Success! Published to lbry://%s txid: %s nout: %d", name, tx.id, nout)
+        defer.returnValue({
+            "success": True,
+            "tx": tx,
+            "claim_id": txo.claim_id,
+            "claim_address": self.ledger.hash160_to_address(txo.script.values['pubkey_hash']),
+            "output": tx.outputs[nout]
+        })

     def _get_or_download_sd_blob(self, blob, sd_hash):
         if blob:
@@ -482,7 +487,7 @@ class Daemon(AuthJSONRPCServer):
         Resolve a name and return the estimated stream cost
         """
-        resolved = yield self.wallet.resolve(uri)
+        resolved = (yield self.wallet.resolve(uri))[uri]
         if resolved:
             claim_response = resolved[uri]
         else:
@@ -510,7 +515,7 @@ class Daemon(AuthJSONRPCServer):
     @defer.inlineCallbacks
     def _get_lbry_file_dict(self, lbry_file, full_status=False):
-        key = binascii.b2a_hex(lbry_file.key) if lbry_file.key else None
+        key = hexlify(lbry_file.key) if lbry_file.key else None
         full_path = os.path.join(lbry_file.download_directory, lbry_file.file_name)
         mime_type = mimetypes.guess_type(full_path)[0]
         if os.path.isfile(full_path):
@@ -772,7 +777,6 @@ class Daemon(AuthJSONRPCServer):
         log.info("Get version info: " + json.dumps(platform_info))
         return self._render_response(platform_info)

-    # @AuthJSONRPCServer.deprecated() # deprecated actually disables the call
     def jsonrpc_report_bug(self, message=None):
         """
         Report a bug to slack
@@ -883,12 +887,12 @@ class Daemon(AuthJSONRPCServer):
             'auto_renew_claim_height_delta': int
         }

-        for key, setting_type in setting_types.iteritems():
+        for key, setting_type in setting_types.items():
             if key in new_settings:
                 if isinstance(new_settings[key], setting_type):
                     conf.settings.update({key: new_settings[key]},
                                          data_types=(conf.TYPE_RUNTIME, conf.TYPE_PERSISTED))
-                elif setting_type is dict and isinstance(new_settings[key], (unicode, str)):
+                elif setting_type is dict and isinstance(new_settings[key], str):
                     decoded = json.loads(str(new_settings[key]))
                     conf.settings.update({key: decoded},
                                          data_types=(conf.TYPE_RUNTIME, conf.TYPE_PERSISTED))
@@ -948,6 +952,7 @@ class Daemon(AuthJSONRPCServer):
         return self._render_response(sorted([command for command in self.callable_methods.keys()]))

     @requires(WALLET_COMPONENT)
+    @defer.inlineCallbacks
     def jsonrpc_wallet_balance(self, address=None, include_unconfirmed=False):
         """
         Return the balance of the wallet
@@ -963,11 +968,12 @@ class Daemon(AuthJSONRPCServer):
         Returns:
             (float) amount of lbry credits in wallet
         """
-        if address is None:
-            return self._render_response(float(self.wallet.get_balance()))
-        else:
-            return self._render_response(float(
-                self.wallet.get_address_balance(address, include_unconfirmed)))
+        if address is not None:
+            raise NotImplementedError("Limiting by address needs to be re-implemented in new wallet.")
+        dewies = yield self.wallet.default_account.get_balance(
+            0 if include_unconfirmed else 6
+        )
+        defer.returnValue(round(dewies / COIN, 3))

     @requires(WALLET_COMPONENT)
     @defer.inlineCallbacks
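The `wallet_balance` rewrite above keeps balances in dewies (the smallest LBC unit) internally and only converts to LBC at the API boundary. A minimal sketch of that conversion, assuming `COIN = 10**8` as in torba's constants:

```python
# Sketch of the dewies <-> LBC conversion behind the new wallet_balance.
# Assumption: COIN = 10**8 (torba.constants.COIN); dewies are plain ints.
COIN = 10 ** 8


def dewies_to_lbc(dewies: int) -> float:
    """Convert an integer dewies balance to LBC, rounded to 3 places."""
    return round(dewies / COIN, 3)


def lbc_to_dewies(lbc: float) -> int:
    """Convert an LBC amount to integer dewies."""
    return int(lbc * COIN)


print(dewies_to_lbc(250_000_000))  # 2.5
```

Doing arithmetic in integer dewies avoids float rounding drift; the lossy `round(..., 3)` happens only once, when formatting the response.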
@@ -997,7 +1003,6 @@ class Daemon(AuthJSONRPCServer):
         defer.returnValue(response)

     @requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
-    @defer.inlineCallbacks
     def jsonrpc_wallet_decrypt(self):
         """
         Decrypt an encrypted wallet, this will remove the wallet password
@@ -1011,13 +1016,9 @@ class Daemon(AuthJSONRPCServer):
         Returns:
             (bool) true if wallet is decrypted, otherwise false
         """
-        result = self.wallet.decrypt_wallet()
-        response = yield self._render_response(result)
-        defer.returnValue(response)
+        return defer.succeed(self.wallet.decrypt_wallet())

     @requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
-    @defer.inlineCallbacks
     def jsonrpc_wallet_encrypt(self, new_password):
         """
         Encrypt a wallet with a password, if the wallet is already encrypted this will update
@@ -1032,12 +1033,10 @@ class Daemon(AuthJSONRPCServer):
         Returns:
             (bool) true if wallet is decrypted, otherwise false
         """
-        self.wallet.encrypt_wallet(new_password)
-        response = yield self._render_response(self.wallet.wallet.use_encryption)
-        defer.returnValue(response)
+        return defer.succeed(self.wallet.encrypt_wallet(new_password))
     @defer.inlineCallbacks
+    @AuthJSONRPCServer.deprecated("stop")
     def jsonrpc_daemon_stop(self):
         """
         Stop lbrynet-daemon
@@ -1051,11 +1050,24 @@ class Daemon(AuthJSONRPCServer):
         Returns:
             (string) Shutdown message
         """
+        return self.jsonrpc_stop()
+
+    def jsonrpc_stop(self):
+        """
+        Stop lbrynet
+
+        Usage:
+            stop
+
+        Options:
+            None
+
+        Returns:
+            (string) Shutdown message
+        """
         log.info("Shutting down lbrynet daemon")
-        response = yield self._render_response("Shutting down")
         reactor.callLater(0.1, reactor.fireSystemEvent, "shutdown")
-        defer.returnValue(response)
+        defer.returnValue("Shutting down")

     @requires(FILE_MANAGER_COMPONENT)
     @defer.inlineCallbacks
@@ -1148,7 +1160,10 @@ class Daemon(AuthJSONRPCServer):
         """
         try:
-            metadata = yield self._resolve_name(name, force_refresh=force)
+            name = parse_lbry_uri(name).name
+            metadata = yield self.wallet.resolve(name, check_cache=not force)
+            if name in metadata:
+                metadata = metadata[name]
         except UnknownNameError:
             log.info('Name %s is not known', name)
             defer.returnValue(None)
@@ -1361,7 +1376,7 @@ class Daemon(AuthJSONRPCServer):
         resolved = resolved['claim']
         txid, nout, name = resolved['txid'], resolved['nout'], resolved['name']
         claim_dict = ClaimDict.load_dict(resolved['value'])
-        sd_hash = claim_dict.source_hash
+        sd_hash = claim_dict.source_hash.decode()

         if sd_hash in self.streams:
             log.info("Already waiting on lbry://%s to start downloading", name)
@@ -1532,7 +1547,6 @@ class Daemon(AuthJSONRPCServer):
                 'claim_id' : (str) claim ID of the resulting claim
             }
         """
-
         try:
             parsed = parse_lbry_uri(channel_name)
             if not parsed.is_channel:
@@ -1541,29 +1555,24 @@ class Daemon(AuthJSONRPCServer):
                 raise Exception("Invalid channel uri")
         except (TypeError, URIParseError):
             raise Exception("Invalid channel name")
+        amount = self.get_dewies_or_error("amount", amount)
         if amount <= 0:
             raise Exception("Invalid amount")
-        yield self.wallet.update_balance()
-        if amount >= self.wallet.get_balance():
-            balance = yield self.wallet.get_max_usable_balance_for_claim(channel_name)
-            max_bid_amount = balance - MAX_UPDATE_FEE_ESTIMATE
-            if balance <= MAX_UPDATE_FEE_ESTIMATE:
-                raise InsufficientFundsError(
-                    "Insufficient funds, please deposit additional LBC. Minimum additional LBC needed {}"
-                    .format(MAX_UPDATE_FEE_ESTIMATE - balance))
-            elif amount > max_bid_amount:
-                raise InsufficientFundsError(
-                    "Please wait for any pending bids to resolve or lower the bid value. "
-                    "Currently the maximum amount you can specify for this channel is {}"
-                    .format(max_bid_amount)
-                )
-        result = yield self.wallet.claim_new_channel(channel_name, amount)
+        tx = yield self.wallet.claim_new_channel(channel_name, amount)
+        self.wallet.save()
         self.analytics_manager.send_new_channel()
-        log.info("Claimed a new channel! Result: %s", result)
-        response = yield self._render_response(result)
-        defer.returnValue(response)
+        nout = 0
+        txo = tx.outputs[nout]
+        log.info("Claimed a new channel! lbry://%s txid: %s nout: %d", channel_name, tx.id, nout)
+        defer.returnValue({
+            "success": True,
+            "tx": tx,
+            "claim_id": txo.claim_id,
+            "claim_address": self.ledger.hash160_to_address(txo.script.values['pubkey_hash']),
+            "output": txo
+        })

     @requires(WALLET_COMPONENT)
     @defer.inlineCallbacks
@@ -1735,23 +1744,28 @@ class Daemon(AuthJSONRPCServer):
         if bid <= 0.0:
             raise ValueError("Bid value must be greater than 0.0")

+        bid = int(bid * COIN)
         for address in [claim_address, change_address]:
             if address is not None:
                 # raises an error if the address is invalid
                 decode_address(address)

-        yield self.wallet.update_balance()
-        if bid >= self.wallet.get_balance():
-            balance = yield self.wallet.get_max_usable_balance_for_claim(name)
-            max_bid_amount = balance - MAX_UPDATE_FEE_ESTIMATE
-            if balance <= MAX_UPDATE_FEE_ESTIMATE:
-                raise InsufficientFundsError(
-                    "Insufficient funds, please deposit additional LBC. Minimum additional LBC needed {}"
-                    .format(MAX_UPDATE_FEE_ESTIMATE - balance))
-            elif bid > max_bid_amount:
-                raise InsufficientFundsError(
-                    "Please lower the bid value, the maximum amount you can specify for this claim is {}."
-                    .format(max_bid_amount))
+        available = yield self.wallet.default_account.get_balance()
+        if bid >= available:
+            # TODO: add check for existing claim balance
+            #balance = yield self.wallet.get_max_usable_balance_for_claim(name)
+            #max_bid_amount = balance - MAX_UPDATE_FEE_ESTIMATE
+            #if balance <= MAX_UPDATE_FEE_ESTIMATE:
+            raise InsufficientFundsError(
+                "Insufficient funds, please deposit additional LBC. Minimum additional LBC needed {}"
+                .format(round((bid - available)/COIN + 0.01, 2))
+            )
+            #    .format(MAX_UPDATE_FEE_ESTIMATE - balance))
+            #elif bid > max_bid_amount:
+            #    raise InsufficientFundsError(
+            #        "Please lower the bid value, the maximum amount you can specify for this claim is {}."
+            #        .format(max_bid_amount))

         metadata = metadata or {}
         if fee is not None:
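The new insufficient-funds branch above reports the shortfall in LBC (padded by 0.01 to cover the fee) instead of the old max-bid estimate. The arithmetic, sketched standalone with hypothetical bid/available values and `COIN` assumed to be `10**8`:

```python
# Sketch of the shortfall message arithmetic from the claim path.
# Assumption: COIN = 10**8 (torba.constants.COIN); bid/available are dewies.
COIN = 10 ** 8


def shortfall_message(bid: int, available: int) -> str:
    """Mirror the error text: shortfall in LBC, padded by 0.01 and rounded."""
    needed = round((bid - available) / COIN + 0.01, 2)
    return ("Insufficient funds, please deposit additional LBC. "
            "Minimum additional LBC needed {}".format(needed))


# e.g. bidding 5 LBC with only 3 LBC available -> roughly 2.01 LBC short
print(shortfall_message(5 * COIN, 3 * COIN))
```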
@@ -1789,7 +1803,7 @@ class Daemon(AuthJSONRPCServer):
                 log.warning("Stripping empty fee from published metadata")
                 del metadata['fee']
             elif 'address' not in metadata['fee']:
-                address = yield self.wallet.get_least_used_address()
+                address = yield self.wallet.default_account.receiving.get_or_create_usable_address()
                 metadata['fee']['address'] = address
         if 'fee' in metadata and 'version' not in metadata['fee']:
             metadata['fee']['version'] = '_0_0_1'
@@ -1841,24 +1855,19 @@ class Daemon(AuthJSONRPCServer):
             'channel_name': channel_name
         })

-        if channel_id:
-            certificate_id = channel_id
-        elif channel_name:
-            certificate_id = None
-            my_certificates = yield self.wallet.channel_list()
-            for certificate in my_certificates:
-                if channel_name == certificate['name']:
-                    certificate_id = certificate['claim_id']
+        certificate = None
+        if channel_name:
+            certificates = yield self.wallet.get_certificates(channel_name)
+            for cert in certificates:
+                if cert.claim_id == channel_id:
+                    certificate = cert
                     break
-            if not certificate_id:
+            if certificate is None:
                 raise Exception("Cannot publish using channel %s" % channel_name)
-        else:
-            certificate_id = None

-        result = yield self._publish_stream(name, bid, claim_dict, file_path, certificate_id,
+        result = yield self._publish_stream(name, bid, claim_dict, file_path, certificate,
                                             claim_address, change_address)
-        response = yield self._render_response(result)
-        defer.returnValue(response)
+        defer.returnValue(result)

     @requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
     @defer.inlineCallbacks
@@ -1889,9 +1898,13 @@ class Daemon(AuthJSONRPCServer):
         if nout is None and txid is not None:
             raise Exception('Must specify nout')

-        result = yield self.wallet.abandon_claim(claim_id, txid, nout)
+        tx = yield self.wallet.abandon_claim(claim_id, txid, nout)
         self.analytics_manager.send_claim_action('abandon')
-        defer.returnValue(result)
+        defer.returnValue({
+            "success": True,
+            "tx": tx,
+            "claim_id": claim_id
+        })

     @requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
     @defer.inlineCallbacks
@@ -2148,8 +2161,7 @@ class Daemon(AuthJSONRPCServer):
             except URIParseError:
                 results[chan_uri] = {"error": "%s is not a valid uri" % chan_uri}

-        resolved = yield self.wallet.resolve(*valid_uris, check_cache=False, page=page,
-                                             page_size=page_size)
+        resolved = yield self.wallet.resolve(*valid_uris, page=page, page_size=page_size)
         for u in resolved:
             if 'error' in resolved[u]:
                 results[u] = resolved[u]
@@ -2345,6 +2357,7 @@ class Daemon(AuthJSONRPCServer):
         """

         def _disp(address):
+            address = str(address)
            log.info("Got unused wallet address: " + address)
            return defer.succeed(address)
@@ -2353,36 +2366,6 @@ class Daemon(AuthJSONRPCServer):
         d.addCallback(lambda address: self._render_response(address))
         return d

-    @requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
-    @AuthJSONRPCServer.deprecated("wallet_send")
-    @defer.inlineCallbacks
-    def jsonrpc_send_amount_to_address(self, amount, address):
-        """
-        Queue a payment of credits to an address
-
-        Usage:
-            send_amount_to_address (<amount> | --amount=<amount>) (<address> | --address=<address>)
-
-        Options:
-            --amount=<amount>   : (float) amount to send
-            --address=<address> : (str) address to send credits to
-
-        Returns:
-            (bool) true if payment successfully scheduled
-        """
-        if amount < 0:
-            raise NegativeFundsError()
-        elif not amount:
-            raise NullFundsError()
-        reserved_points = self.wallet.reserve_points(address, amount)
-        if reserved_points is None:
-            raise InsufficientFundsError()
-        yield self.wallet.send_points_to_address(reserved_points, amount)
-        self.analytics_manager.send_credits_sent()
-        defer.returnValue(True)
-
     @requires(WALLET_COMPONENT, conditions=[WALLET_IS_UNLOCKED])
     @defer.inlineCallbacks
     def jsonrpc_wallet_send(self, amount, address=None, claim_id=None):
@@ -2402,7 +2385,16 @@ class Daemon(AuthJSONRPCServer):
         Returns:
             If sending to an address:
-                (bool) true if payment successfully scheduled
+                (dict) true if payment successfully scheduled
+                {
+                    "hex": (str) raw transaction,
+                    "inputs": (list) inputs(dict) used for the transaction,
+                    "outputs": (list) outputs(dict) for the transaction,
+                    "total_fee": (int) fee in dewies,
+                    "total_input": (int) total of inputs in dewies,
+                    "total_output": (int) total of outputs in dewies(input - fees),
+                    "txid": (str) txid of the transaction,
+                }

             If sending a claim tip:
                 (dict) Dictionary containing the result of the support
@@ -2413,25 +2405,26 @@ class Daemon(AuthJSONRPCServer):
             }
         """

+        amount = self.get_dewies_or_error("amount", amount)
+        if not amount:
+            raise NullFundsError
+        elif amount < 0:
+            raise NegativeFundsError()
+
         if address and claim_id:
             raise Exception("Given both an address and a claim id")
         elif not address and not claim_id:
             raise Exception("Not given an address or a claim id")
-        try:
-            amount = Decimal(str(amount))
-        except InvalidOperation:
-            raise TypeError("Amount does not represent a valid decimal.")
-        if amount < 0:
-            raise NegativeFundsError()
-        elif not amount:
-            raise NullFundsError()

         if address:
             # raises an error if the address is invalid
             decode_address(address)
-            result = yield self.jsonrpc_send_amount_to_address(amount, address)
+            reserved_points = self.wallet.reserve_points(address, amount)
+            if reserved_points is None:
+                raise InsufficientFundsError()
+            result = yield self.wallet.send_points_to_address(reserved_points, amount)
+            self.analytics_manager.send_credits_sent()
         else:
             validate_claim_id(claim_id)
             result = yield self.wallet.tip_claim(claim_id, amount)
@@ -2442,7 +2435,7 @@ class Daemon(AuthJSONRPCServer):
     @defer.inlineCallbacks
     def jsonrpc_wallet_prefill_addresses(self, num_addresses, amount, no_broadcast=False):
         """
-        Create new addresses, each containing `amount` credits
+        Create new UTXOs, each containing `amount` credits

         Usage:
             wallet_prefill_addresses [--no_broadcast]
@@ -2457,17 +2450,12 @@ class Daemon(AuthJSONRPCServer):
         Returns:
             (dict) the resulting transaction
         """
-        if amount < 0:
-            raise NegativeFundsError()
-        elif not amount:
-            raise NullFundsError()
-
         broadcast = not no_broadcast
-        tx = yield self.wallet.create_addresses_with_balance(
-            num_addresses, amount, broadcast=broadcast)
-        tx['broadcast'] = broadcast
-        defer.returnValue(tx)
+        return self.jsonrpc_fund(self.wallet.default_account.name,
+                                 self.wallet.default_account.name,
+                                 amount=amount,
+                                 outputs=num_addresses,
+                                 broadcast=broadcast)

     @requires(WALLET_COMPONENT)
     @defer.inlineCallbacks
@@ -2628,7 +2616,7 @@ class Daemon(AuthJSONRPCServer):
         if not utils.is_valid_blobhash(blob_hash):
             raise Exception("invalid blob hash")

-        finished_deferred = self.dht_node.iterativeFindValue(binascii.unhexlify(blob_hash))
+        finished_deferred = self.dht_node.iterativeFindValue(unhexlify(blob_hash))

         def trap_timeout(err):
             err.trap(defer.TimeoutError)
@@ -2639,7 +2627,7 @@ class Daemon(AuthJSONRPCServer):
         peers = yield finished_deferred
         results = [
             {
-                "node_id": node_id.encode('hex'),
+                "node_id": hexlify(node_id).decode(),
                 "host": host,
                 "port": port
             }
@@ -2748,7 +2736,7 @@ class Daemon(AuthJSONRPCServer):
         """
         if uri or stream_hash or sd_hash:
             if uri:
-                metadata = yield self._resolve_name(uri)
+                metadata = (yield self.wallet.resolve(uri))[uri]
                 sd_hash = utils.get_sd_hash(metadata)
                 stream_hash = yield self.storage.get_stream_hash_for_sd_hash(sd_hash)
             elif stream_hash:
@@ -2768,7 +2756,7 @@ class Daemon(AuthJSONRPCServer):
             if sd_hash in self.blob_manager.blobs:
                 blobs = [self.blob_manager.blobs[sd_hash]] + blobs
             else:
-                blobs = self.blob_manager.blobs.itervalues()
+                blobs = self.blob_manager.blobs.values()

         if needed:
             blobs = [blob for blob in blobs if not blob.get_is_verified()]
@@ -2844,21 +2832,21 @@ class Daemon(AuthJSONRPCServer):

         contact = None
         if node_id and address and port:
-            contact = self.dht_node.contact_manager.get_contact(node_id.decode('hex'), address, int(port))
+            contact = self.dht_node.contact_manager.get_contact(unhexlify(node_id), address, int(port))
             if not contact:
                 contact = self.dht_node.contact_manager.make_contact(
-                    node_id.decode('hex'), address, int(port), self.dht_node._protocol
+                    unhexlify(node_id), address, int(port), self.dht_node._protocol
                 )
         if not contact:
             try:
-                contact = yield self.dht_node.findContact(node_id.decode('hex'))
+                contact = yield self.dht_node.findContact(unhexlify(node_id))
             except TimeoutError:
                 result = {'error': 'timeout finding peer'}
                 defer.returnValue(result)
         if not contact:
             defer.returnValue({'error': 'peer not found'})
         try:
-            result = yield contact.ping()
+            result = (yield contact.ping()).decode()
         except TimeoutError:
             result = {'error': 'ping timeout'}
         defer.returnValue(result)
@@ -2892,51 +2880,34 @@ class Daemon(AuthJSONRPCServer):
             "node_id": (str) the local dht node id
         }
        """
         result = {}
-        data_store = self.dht_node._dataStore._dict
-        datastore_len = len(data_store)
+        data_store = self.dht_node._dataStore
         hosts = {}

-        if datastore_len:
-            for k, v in data_store.iteritems():
-                for contact, value, lastPublished, originallyPublished, originalPublisherID in v:
-                    if contact in hosts:
-                        blobs = hosts[contact]
-                    else:
-                        blobs = []
-                    blobs.append(k.encode('hex'))
-                    hosts[contact] = blobs
+        for k, v in data_store.items():
+            for contact in map(itemgetter(0), v):
+                hosts.setdefault(contact, []).append(hexlify(k).decode())

-        contact_set = []
-        blob_hashes = []
+        contact_set = set()
+        blob_hashes = set()
         result['buckets'] = {}

         for i in range(len(self.dht_node._routingTable._buckets)):
             for contact in self.dht_node._routingTable._buckets[i]._contacts:
-                contacts = result['buckets'].get(i, [])
-                if contact in hosts:
-                    blobs = hosts[contact]
-                    del hosts[contact]
-                else:
-                    blobs = []
+                blobs = list(hosts.pop(contact)) if contact in hosts else []
+                blob_hashes.update(blobs)
                 host = {
                     "address": contact.address,
                     "port": contact.port,
-                    "node_id": contact.id.encode("hex"),
+                    "node_id": hexlify(contact.id).decode(),
                     "blobs": blobs,
                 }
-                for blob_hash in blobs:
-                    if blob_hash not in blob_hashes:
-                        blob_hashes.append(blob_hash)
-                contacts.append(host)
-                result['buckets'][i] = contacts
-                if contact.id.encode('hex') not in contact_set:
-                    contact_set.append(contact.id.encode("hex"))
+                result['buckets'].setdefault(i, []).append(host)
+                contact_set.add(hexlify(contact.id).decode())

-        result['contacts'] = contact_set
-        result['blob_hashes'] = blob_hashes
-        result['node_id'] = self.dht_node.node_id.encode('hex')
+        result['contacts'] = list(contact_set)
+        result['blob_hashes'] = list(blob_hashes)
+        result['node_id'] = hexlify(self.dht_node.node_id).decode()
         return self._render_response(result)

     # the single peer downloader needs wallet access
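The routing-table hunk above swaps Python 2's `str.encode('hex')`/`str.decode('hex')` for `binascii.hexlify(...).decode()`/`unhexlify(...)` and collapses the manual list bookkeeping into `dict.setdefault`. A standalone sketch of both idioms; the contacts and datastore entries here are made-up stand-ins, not real DHT objects:

```python
from binascii import hexlify, unhexlify
from operator import itemgetter

# Python 3 replacement for the removed b"...".encode('hex') idiom
node_id = b"\xde\xad\xbe\xef"
hex_id = hexlify(node_id).decode()  # 'deadbeef'
assert unhexlify(hex_id) == node_id

# Grouping blob hashes per contact, as the rewritten loop does:
# each datastore value is a list of tuples whose first element is the contact.
data_store = {
    b"\x01": [("contact-a", "value", 0, 0, b"id")],
    b"\x02": [("contact-a", "value", 0, 0, b"id"),
              ("contact-b", "value", 0, 0, b"id")],
}
hosts = {}
for k, v in data_store.items():
    for contact in map(itemgetter(0), v):
        hosts.setdefault(contact, []).append(hexlify(k).decode())

print(hosts)  # {'contact-a': ['01', '02'], 'contact-b': ['02']}
```

`setdefault` replaces the four-line "check membership, create list, append, assign" pattern with one expression, and the `set`-based `contact_set`/`blob_hashes` make the later dedup checks unnecessary.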
@@ -3039,7 +3010,7 @@ class Daemon(AuthJSONRPCServer):
         }
         try:
-            resolved_result = yield self.wallet.resolve(uri)
+            resolved_result = (yield self.wallet.resolve(uri))[uri]
             response['did_resolve'] = True
         except UnknownNameError:
             response['error'] = "Failed to resolve name"
@ -3089,29 +3060,245 @@ class Daemon(AuthJSONRPCServer):
response['head_blob_availability'].get('is_available') response['head_blob_availability'].get('is_available')
defer.returnValue(response) defer.returnValue(response)
-   @defer.inlineCallbacks
-   def jsonrpc_cli_test_command(self, pos_arg, pos_args=[], pos_arg2=None, pos_arg3=None,
-                                a_arg=False, b_arg=False):
-       """
-       This command is only for testing the CLI argument parsing
-       Usage:
-           cli_test_command [--a_arg] [--b_arg] (<pos_arg> | --pos_arg=<pos_arg>)
-                            [<pos_args>...] [--pos_arg2=<pos_arg2>]
-                            [--pos_arg3=<pos_arg3>]
-       Options:
-           --a_arg : (bool) a arg
-           --b_arg : (bool) b arg
-           --pos_arg=<pos_arg> : (int) pos arg
-           --pos_args=<pos_args> : (int) pos args
-           --pos_arg2=<pos_arg2> : (int) pos arg 2
-           --pos_arg3=<pos_arg3> : (int) pos arg 3
-       Returns:
-           pos args
-       """
-       out = (pos_arg, pos_args, pos_arg2, pos_arg3, a_arg, b_arg)
-       response = yield self._render_response(out)
-       defer.returnValue(response)

    #######################
    # New Wallet Commands #
    #######################
    # TODO:
    # Delete this after all commands have been migrated
    # and refactored.

    @requires("wallet")
    def jsonrpc_account(self, account_name, create=False, delete=False, single_key=False,
                        seed=None, private_key=None, public_key=None,
                        change_gap=None, change_max_uses=None,
                        receiving_gap=None, receiving_max_uses=None,
                        rename=None, default=False):
        """
        Create a new account or update some settings on an existing account. If no
        creation or modification options are provided but the account exists then
        it will just display the unmodified settings for the account.

        Usage:
            account [--create | --delete] (<account_name> | --account_name=<account_name>) [--single_key]
                    [--seed=<seed> | --private_key=<private_key> | --public_key=<public_key>]
                    [--change_gap=<change_gap>] [--change_max_uses=<change_max_uses>]
                    [--receiving_gap=<receiving_gap>] [--receiving_max_uses=<receiving_max_uses>]
                    [--rename=<rename>] [--default]

        Options:
            --account_name=<account_name>             : (str) name of the account to create or update
            --create                                  : (bool) create the account
            --delete                                  : (bool) delete the account
            --single_key                              : (bool) create single key account, default is multi-key
            --seed=<seed>                             : (str) seed to generate new account from
            --private_key=<private_key>               : (str) private key for new account
            --public_key=<public_key>                 : (str) public key for new account
            --receiving_gap=<receiving_gap>           : (int) set the gap for receiving addresses
            --receiving_max_uses=<receiving_max_uses> : (int) set the maximum number of times to
                                                        use a receiving address
            --change_gap=<change_gap>                 : (int) set the gap for change addresses
            --change_max_uses=<change_max_uses>       : (int) set the maximum number of times to
                                                        use a change address
            --rename=<rename>                         : (str) change name of existing account
            --default                                 : (bool) make this account the default

        Returns:
            (map) new or updated account details
        """
        wallet = self.wallet.default_wallet
        if create:
            self.error_if_account_exists(account_name)
if single_key:
address_generator = {'name': SingleKey.name}
else:
address_generator = {
'name': HierarchicalDeterministic.name,
'receiving': {
'gap': receiving_gap or 20,
'maximum_uses_per_address': receiving_max_uses or 1},
'change': {
'gap': change_gap or 6,
'maximum_uses_per_address': change_max_uses or 1}
}
ledger = self.wallet.get_or_create_ledger('lbc_mainnet')
if seed or private_key or public_key:
account = LBCAccount.from_dict(ledger, wallet, {
'name': account_name,
'seed': seed,
'private_key': private_key,
'public_key': public_key,
'address_generator': address_generator
})
else:
account = LBCAccount.generate(
ledger, wallet, account_name, address_generator)
wallet.save()
elif delete:
account = self.get_account_or_error('account_name', account_name)
wallet.accounts.remove(account)
wallet.save()
return "Account '{}' deleted.".format(account_name)
else:
change_made = False
account = self.get_account_or_error('account_name', account_name)
if rename is not None:
self.error_if_account_exists(rename)
account.name = rename
change_made = True
if account.receiving.name == HierarchicalDeterministic.name:
address_changes = {
'change': {'gap': change_gap, 'maximum_uses_per_address': change_max_uses},
'receiving': {'gap': receiving_gap, 'maximum_uses_per_address': receiving_max_uses},
}
for chain_name in address_changes:
chain = getattr(account, chain_name)
for attr, value in address_changes[chain_name].items():
if value is not None:
setattr(chain, attr, value)
change_made = True
if change_made:
wallet.save()
if default:
wallet.accounts.remove(account)
wallet.accounts.insert(0, account)
wallet.save()
result = account.to_dict()
result.pop('certificates', None)
result['is_default'] = wallet.accounts[0] == account
return result
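The `account` command above assembles the `address_generator` configuration inline. The same defaults can be sketched as a standalone helper (gap 20 / max one use per receiving address, gap 6 / max one use per change address); the chain-name strings here are placeholders, not necessarily the exact `SingleKey.name` / `HierarchicalDeterministic.name` values from torba:

```python
def make_address_generator(single_key=False, receiving_gap=None, change_gap=None,
                           receiving_max_uses=None, change_max_uses=None):
    # single-key accounts have no address chains to configure
    if single_key:
        return {'name': 'single-key'}  # placeholder for SingleKey.name
    return {
        'name': 'deterministic-chain',  # placeholder for HierarchicalDeterministic.name
        'receiving': {'gap': receiving_gap or 20,
                      'maximum_uses_per_address': receiving_max_uses or 1},
        'change': {'gap': change_gap or 6,
                   'maximum_uses_per_address': change_max_uses or 1},
    }

print(make_address_generator()['receiving']['gap'])  # 20
```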
@requires("wallet")
def jsonrpc_balance(self, account_name=None, confirmations=6, include_reserved=False,
include_claims=False):
"""
Return the balance of an individual account or all of the accounts.
Usage:
balance [<account_name>] [--confirmations=<confirmations>]
[--include_reserved] [--include_claims]
Options:
--account=<account_name> : (str) If provided only the balance for this
account will be given
--confirmations=<confirmations> : (int) required confirmations (default: 6)
--include_reserved : (bool) include reserved UTXOs (default: false)
--include_claims : (bool) include claims, requires that an
LBC account is specified (default: false)
Returns:
(map) balance of account(s)
"""
if account_name:
for account in self.wallet.accounts:
if account.name == account_name:
if include_claims and not isinstance(account, LBCAccount):
raise Exception(
"'--include-claims' requires specifying an LBC ledger account. "
"Found '{}', but it's an {} ledger account."
.format(account_name, account.ledger.symbol)
)
args = {
'confirmations': confirmations,
'include_reserved': include_reserved
}
if include_claims:
args['include_claims'] = True
return account.get_balance(**args)
raise Exception("Couldn't find an account named: '{}'.".format(account_name))
else:
if include_claims:
raise Exception("'--include-claims' requires specifying an LBC account.")
return self.wallet.get_balances(confirmations)
@requires("wallet")
def jsonrpc_max_address_gap(self, account_name):
"""
Finds ranges of consecutive addresses that are unused and returns the length
of the longest such range, for both change and receiving address chains. This is
useful to figure out ideal values to set for 'receiving_gap' and 'change_gap'
account settings.
Usage:
max_address_gap (<account_name> | --account=<account_name>)
Options:
--account=<account_name> : (str) account for which to get max gaps
Returns:
(map) maximum gap for change and receiving addresses
"""
return self.get_account_or_error('account', account_name).get_max_gap()
@requires("wallet")
def jsonrpc_fund(self, to_account, from_account, amount=0,
everything=False, outputs=1, broadcast=False):
"""
Transfer some amount (or --everything) to an account from another
account (can be the same account). Amounts are interpreted as LBC.
You can also spread the transfer across a number of --outputs (cannot
be used together with --everything).
Usage:
fund (<to_account> | --to_account=<to_account>)
(<from_account> | --from_account=<from_account>)
(<amount> | --amount=<amount> | --everything)
[<outputs> | --outputs=<outputs>]
[--broadcast]
Options:
--to_account=<to_account> : (str) send to this account
--from_account=<from_account> : (str) spend from this account
--amount=<amount> : (str) the amount of LBC to transfer
--everything : (bool) transfer everything (excluding claims), default: false.
--outputs=<outputs> : (int) split payment across many outputs, default: 1.
--broadcast : (bool) actually broadcast the transaction, default: false.
Returns:
(map) transaction performing the requested fund transfer
"""
to_account = self.get_account_or_error('to_account', to_account)
from_account = self.get_account_or_error('from_account', from_account)
amount = self.get_dewies_or_error('amount', amount) if amount else None
if not isinstance(outputs, int):
raise ValueError("--outputs must be an integer.")
if everything and outputs > 1:
raise ValueError("Using --everything along with --outputs is not supported.")
return from_account.fund(
to_account=to_account, amount=amount, everything=everything,
outputs=outputs, broadcast=broadcast
)
def get_account_or_error(self, argument: str, account_name: str, lbc_only=False):
for account in self.wallet.default_wallet.accounts:
if account.name == account_name:
if lbc_only and not isinstance(account, LBCAccount):
raise ValueError(
"Found '{}', but it's an {} ledger account. "
"'{}' requires specifying an LBC ledger account."
.format(account_name, account.ledger.symbol, argument)
)
return account
raise ValueError("Couldn't find an account named: '{}'.".format(account_name))
def error_if_account_exists(self, account_name: str):
for account in self.wallet.default_wallet.accounts:
if account.name == account_name:
raise ValueError("Account with name '{}' already exists.".format(account_name))
@staticmethod
def get_dewies_or_error(argument: str, amount: Union[str, int]):
if isinstance(amount, str):
if '.' in amount:
return int(Decimal(amount) * COIN)
elif amount.isdigit():
amount = int(amount)
if isinstance(amount, int):
return amount * COIN
raise ValueError("Invalid value for '{}' argument: {}".format(argument, amount))
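`get_dewies_or_error` above converts LBC amounts to dewies, the smallest unit (analogous to satoshis). A standalone sketch of the same conversion, assuming `COIN = 10**8` dewies per LBC:

```python
from decimal import Decimal

COIN = 10**8  # assumed: dewies per LBC, as with satoshis per BTC

def to_dewies(amount):
    # strings with a decimal point go through Decimal to avoid float error;
    # whole-number strings and ints are scaled directly
    if isinstance(amount, str):
        if '.' in amount:
            return int(Decimal(amount) * COIN)
        if amount.isdigit():
            amount = int(amount)
    if isinstance(amount, int):
        return amount * COIN
    raise ValueError("Invalid amount: {!r}".format(amount))

print(to_dewies("1.5"))  # 150000000
```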
def loggly_time_string(dt):
@@ -3170,7 +3357,7 @@ def create_key_getter(field):
            try:
                value = value[key]
            except KeyError as e:
-               errmsg = 'Failed to get "{}", key "{}" was not found.'
-               raise Exception(errmsg.format(field, e.message))
+               errmsg = "Failed to get '{}', key {} was not found."
+               raise Exception(errmsg.format(field, str(e)))
            return value
        return key_getter
@@ -1,224 +0,0 @@
import json
import os
import sys
import colorama
from docopt import docopt
from collections import OrderedDict
from lbrynet import conf
from lbrynet.core import utils
from lbrynet.daemon.auth.client import JSONRPCException, LBRYAPIClient, AuthAPIClient
from lbrynet.daemon.Daemon import Daemon
from lbrynet.core.system_info import get_platform
from jsonrpc.common import RPCError
from requests.exceptions import ConnectionError
from urllib2 import URLError, HTTPError
from httplib import UNAUTHORIZED
def remove_brackets(key):
if key.startswith("<") and key.endswith(">"):
return str(key[1:-1])
return key
def set_kwargs(parsed_args):
kwargs = OrderedDict()
for key, arg in parsed_args.iteritems():
if arg is None:
continue
elif key.startswith("--") and remove_brackets(key[2:]) not in kwargs:
k = remove_brackets(key[2:])
elif remove_brackets(key) not in kwargs:
k = remove_brackets(key)
kwargs[k] = guess_type(arg, k)
return kwargs
def main():
argv = sys.argv[1:]
# check if a config file has been specified. If so, shift
# all the arguments so that the parsing can continue without
# noticing
if len(argv) and argv[0] == "--conf":
if len(argv) < 2:
print_error("No config file specified for --conf option")
print_help()
return
conf.conf_file = argv[1]
argv = argv[2:]
if len(argv):
method, args = argv[0], argv[1:]
else:
print_help()
return
if method in ['help', '--help', '-h']:
if len(args) == 1:
print_help_for_command(args[0])
else:
print_help()
return
elif method in ['version', '--version']:
print utils.json_dumps_pretty(get_platform(get_ip=False))
return
if method not in Daemon.callable_methods:
if method not in Daemon.deprecated_methods:
print_error("\"%s\" is not a valid command." % method)
return
new_method = Daemon.deprecated_methods[method]._new_command
print_error("\"%s\" is deprecated, using \"%s\"." % (method, new_method))
method = new_method
fn = Daemon.callable_methods[method]
parsed = docopt(fn.__doc__, args)
kwargs = set_kwargs(parsed)
colorama.init()
conf.initialize_settings()
try:
api = LBRYAPIClient.get_client()
api.status()
except (URLError, ConnectionError) as err:
if isinstance(err, HTTPError) and err.code == UNAUTHORIZED:
api = AuthAPIClient.config()
# this can happen if the daemon is using auth with the --http-auth flag
# when the config setting is to not use it
try:
api.status()
except:
print_error("Daemon requires authentication, but none was provided.",
suggest_help=False)
return 1
else:
print_error("Could not connect to daemon. Are you sure it's running?",
suggest_help=False)
return 1
# TODO: check if port is bound. Error if its not
try:
result = api.call(method, kwargs)
if isinstance(result, basestring):
# printing the undumped string is prettier
print result
else:
print utils.json_dumps_pretty(result)
except (RPCError, KeyError, JSONRPCException, HTTPError) as err:
if isinstance(err, HTTPError):
error_body = err.read()
try:
error_data = json.loads(error_body)
except ValueError:
print (
"There was an error, and the response was not valid JSON.\n" +
"Raw JSONRPC response:\n" + error_body
)
return 1
print_error(error_data['error']['message'] + "\n", suggest_help=False)
if 'data' in error_data['error'] and 'traceback' in error_data['error']['data']:
print "Here's the traceback for the error you encountered:"
print "\n".join(error_data['error']['data']['traceback'])
print_help_for_command(method)
elif isinstance(err, RPCError):
print_error(err.msg, suggest_help=False)
# print_help_for_command(method)
else:
print_error("Something went wrong\n", suggest_help=False)
print str(err)
return 1
def guess_type(x, key=None):
if not isinstance(x, (unicode, str)):
return x
if key in ('uri', 'channel_name', 'name', 'file_name', 'download_directory'):
return x
if x in ('true', 'True', 'TRUE'):
return True
if x in ('false', 'False', 'FALSE'):
return False
if '.' in x:
try:
return float(x)
except ValueError:
# not a float
pass
try:
return int(x)
except ValueError:
return x
def print_help_suggestion():
print "See `{} help` for more information.".format(os.path.basename(sys.argv[0]))
def print_error(message, suggest_help=True):
error_style = colorama.Style.BRIGHT + colorama.Fore.RED
print error_style + "ERROR: " + message + colorama.Style.RESET_ALL
if suggest_help:
print_help_suggestion()
def print_help():
print "\n".join([
"NAME",
" lbrynet-cli - LBRY command line client.",
"",
"USAGE",
" lbrynet-cli [--conf <config file>] <command> [<args>]",
"",
"EXAMPLES",
" lbrynet-cli commands # list available commands",
" lbrynet-cli status # get daemon status",
" lbrynet-cli --conf ~/l1.conf status # like above but using ~/l1.conf as config file",
" lbrynet-cli resolve_name what # resolve a name",
" lbrynet-cli help resolve_name # get help for a command",
])
def print_help_for_command(command):
fn = Daemon.callable_methods.get(command)
if fn:
print "Help for %s method:\n%s" % (command, fn.__doc__)
def wrap_list_to_term_width(l, width=None, separator=', ', prefix=''):
if width is None:
try:
_, width = os.popen('stty size', 'r').read().split()
width = int(width)
except:
pass
if not width:
width = 80
lines = []
curr_line = ''
for item in l:
new_line = curr_line + item + separator
if len(new_line) + len(prefix) > width:
lines.append(curr_line)
curr_line = item + separator
else:
curr_line = new_line
lines.append(curr_line)
ret = prefix + ("\n" + prefix).join(lines)
if ret.endswith(separator):
ret = ret[:-len(separator)]
return ret
if __name__ == '__main__':
sys.exit(main())
@@ -1,11 +1,11 @@
-# -*- coding: utf-8 -*-
import sys
import code
import argparse
+import asyncio
import logging.handlers
-from exceptions import SystemExit
from twisted.internet import defer, reactor, threads
+from aiohttp import client_exceptions
from lbrynet import analytics
from lbrynet import conf
from lbrynet.core import utils
@@ -13,8 +13,6 @@ from lbrynet.core import log_support
from lbrynet.daemon.auth.client import LBRYAPIClient
from lbrynet.daemon.Daemon import Daemon

-get_client = LBRYAPIClient.get_client

log = logging.getLogger(__name__)
@@ -117,12 +115,12 @@ def get_methods(daemon):
    locs = {}

    def wrapped(name, fn):
-       client = get_client()
+       client = LBRYAPIClient.get_client()
        _fn = getattr(client, name)
        _fn.__doc__ = fn.__doc__
        return {name: _fn}

-   for method_name, method in daemon.callable_methods.iteritems():
+   for method_name, method in daemon.callable_methods.items():
        locs.update(wrapped(method_name, method))
    return locs
@@ -133,14 +131,14 @@ def run_terminal(callable_methods, started_daemon, quiet=False):
    def help(method_name=None):
        if not method_name:
-           print "Available api functions: "
+           print("Available api functions: ")
            for name in callable_methods:
-               print "\t%s" % name
+               print("\t%s" % name)
            return
        if method_name not in callable_methods:
-           print "\"%s\" is not a recognized api function"
+           print("\"%s\" is not a recognized api function")
            return
-       print callable_methods[method_name].__doc__
+       print(callable_methods[method_name].__doc__)
        return

    locs.update({'help': help})
@@ -148,7 +146,7 @@ def run_terminal(callable_methods, started_daemon, quiet=False):
    if started_daemon:
        def exit(status=None):
            if not quiet:
-               print "Stopping lbrynet-daemon..."
+               print("Stopping lbrynet-daemon...")
            callable_methods['daemon_stop']()
            return sys.exit(status)
@@ -158,7 +156,7 @@ def run_terminal(callable_methods, started_daemon, quiet=False):
        try:
            reactor.callLater(0, reactor.stop)
        except Exception as err:
-           print "error stopping reactor: ", err
+           print("error stopping reactor: {}".format(err))
        return sys.exit(status)

    locs.update({'exit': exit})
@@ -184,21 +182,21 @@ def threaded_terminal(started_daemon, quiet):
    d.addErrback(log.exception)


-def start_lbrynet_console(quiet, use_existing_daemon, useauth):
+async def start_lbrynet_console(quiet, use_existing_daemon, useauth):
    if not utils.check_connection():
-       print "Not connected to internet, unable to start"
+       print("Not connected to internet, unable to start")
        raise Exception("Not connected to internet, unable to start")
    if not quiet:
-       print "Starting lbrynet-console..."
+       print("Starting lbrynet-console...")
    try:
-       get_client().status()
+       await LBRYAPIClient.get_client().status()
        d = defer.succeed(False)
        if not quiet:
-           print "lbrynet-daemon is already running, connecting to it..."
+           print("lbrynet-daemon is already running, connecting to it...")
-   except:
+   except client_exceptions.ClientConnectorError:
        if not use_existing_daemon:
            if not quiet:
-               print "Starting lbrynet-daemon..."
+               print("Starting lbrynet-daemon...")
            analytics_manager = analytics.Manager.new_instance()
            d = start_server_and_listen(useauth, analytics_manager, quiet)
        else:
@@ -225,7 +223,8 @@ def main():
        "--http-auth", dest="useauth", action="store_true", default=conf.settings['use_auth_http']
    )
    args = parser.parse_args()
-   start_lbrynet_console(args.quiet, args.use_existing_daemon, args.useauth)
+   loop = asyncio.get_event_loop()
+   loop.run_until_complete(start_lbrynet_console(args.quiet, args.use_existing_daemon, args.useauth))
    reactor.run()
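The console entry point now drives its async startup to completion with an asyncio event loop before handing control to the Twisted reactor. A minimal sketch of that `run_until_complete` pattern, with a placeholder coroutine standing in for the real startup:

```python
import asyncio

async def start_console(quiet):
    # stand-in for the real async startup, which probes the daemon
    # (and optionally launches it) before the Twisted reactor runs
    return "started (quiet={})".format(quiet)

# the diff uses asyncio.get_event_loop(); an explicit new loop is
# used here to keep the sketch self-contained
loop = asyncio.new_event_loop()
result = loop.run_until_complete(start_console(True))
loop.close()
print(result)  # started (quiet=True)
```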
@@ -13,7 +13,6 @@ import argparse
import logging.handlers

from twisted.internet import reactor
-from jsonrpc.proxy import JSONRPCProxy

from lbrynet import conf
from lbrynet.core import utils, system_info
@@ -26,20 +25,13 @@ def test_internet_connection():
    return utils.check_connection()


-def start():
-   """The primary entry point for launching the daemon."""
-
-   # postpone loading the config file to after the CLI arguments
-   # have been parsed, as they may contain an alternate config file location
-   conf.initialize_settings(load_conf_file=False)
+def start(argv=None, conf_path=None):
+   if conf_path is not None:
+       conf.conf_file = conf_path
+   conf.initialize_settings()

-   parser = argparse.ArgumentParser(description="Launch lbrynet-daemon")
-   parser.add_argument(
-       "--conf",
-       help="specify an alternative configuration file",
-       type=str,
-       default=None
-   )
+   parser = argparse.ArgumentParser()
    parser.add_argument(
        "--http-auth", dest="useauth", action="store_true", default=conf.settings['use_auth_http']
    )
@@ -57,15 +49,14 @@ def start():
        help='Show daemon version and quit'
    )
-   args = parser.parse_args()
-   update_settings_from_args(args)
+   args = parser.parse_args(argv)
+   if args.useauth:
+       conf.settings.update({'use_auth_http': args.useauth}, data_types=(conf.TYPE_CLI,))
+   conf.settings.load_conf_file_settings()

    if args.version:
        version = system_info.get_platform(get_ip=False)
        version['installation_id'] = conf.settings.installation_id
-       print utils.json_dumps_pretty(version)
+       print(utils.json_dumps_pretty(version))
        return

    lbrynet_log = conf.settings.get_log_filename()
@@ -73,14 +64,6 @@ def start():
    log_support.configure_loggly_handler()
    log.debug('Final Settings: %s', conf.settings.get_current_settings_dict())

-   try:
-       log.debug('Checking for an existing lbrynet daemon instance')
-       JSONRPCProxy.from_url(conf.settings.get_api_connection_string()).status()
-       log.info("lbrynet-daemon is already running")
-       return
-   except Exception:
-       log.debug('No lbrynet instance found, continuing to start')
-
    log.info("Starting lbrynet-daemon from command line")

    if test_internet_connection():
@@ -89,17 +72,3 @@ def start():
        reactor.run()
    else:
        log.info("Not connected to internet, unable to start")
-
-
-def update_settings_from_args(args):
-   if args.conf:
-       conf.conf_file = args.conf
-   if args.useauth:
-       conf.settings.update({
-           'use_auth_http': args.useauth,
-       }, data_types=(conf.TYPE_CLI,))
-
-
-if __name__ == "__main__":
-   start()
@@ -29,7 +29,7 @@ STREAM_STAGES = [
log = logging.getLogger(__name__)


-class GetStream(object):
+class GetStream:
    def __init__(self, sd_identifier, wallet, exchange_rate_manager, blob_manager, peer_finder, rate_limiter,
                 payment_rate_manager, storage, max_key_fee, disable_max_key_fee, data_rate=None, timeout=None):
@@ -162,7 +162,7 @@ class GetStream(object):
    @defer.inlineCallbacks
    def _initialize(self, stream_info):
        # Set sd_hash and return key_fee from stream_info
-       self.sd_hash = stream_info.source_hash
+       self.sd_hash = stream_info.source_hash.decode()
        key_fee = None
        if stream_info.has_fee:
            key_fee = yield self.check_fee_and_convert(stream_info.source_fee)
@@ -15,7 +15,7 @@ BITTREX_FEE = 0.0025
COINBASE_FEE = 0.0  # add fee


-class ExchangeRate(object):
+class ExchangeRate:
    def __init__(self, market, spot, ts):
        if not int(time.time()) - ts < 600:
            raise ValueError('The timestamp is too dated.')
@@ -34,7 +34,7 @@ class ExchangeRate(object):
        return {'spot': self.spot, 'ts': self.ts}


-class MarketFeed(object):
+class MarketFeed:
    REQUESTS_TIMEOUT = 20
    EXCHANGE_RATE_UPDATE_RATE_SEC = 300
@@ -96,8 +96,7 @@ class MarketFeed(object):

class BittrexFeed(MarketFeed):
    def __init__(self):
-       MarketFeed.__init__(
-           self,
+       super().__init__(
            "BTCLBC",
            "Bittrex",
            "https://bittrex.com/api/v1.1/public/getmarkethistory",
@@ -122,8 +121,7 @@ class BittrexFeed(MarketFeed):

class LBRYioFeed(MarketFeed):
    def __init__(self):
-       MarketFeed.__init__(
-           self,
+       super().__init__(
            "BTCLBC",
            "lbry.io",
            "https://api.lbry.io/lbc/exchange_rate",
@@ -140,8 +138,7 @@ class LBRYioFeed(MarketFeed):

class LBRYioBTCFeed(MarketFeed):
    def __init__(self):
-       MarketFeed.__init__(
-           self,
+       super().__init__(
            "USDBTC",
            "lbry.io",
            "https://api.lbry.io/lbc/exchange_rate",
@@ -161,8 +158,7 @@ class LBRYioBTCFeed(MarketFeed):

class CryptonatorBTCFeed(MarketFeed):
    def __init__(self):
-       MarketFeed.__init__(
-           self,
+       super().__init__(
            "USDBTC",
            "cryptonator.com",
            "https://api.cryptonator.com/api/ticker/usd-btc",
@@ -183,8 +179,7 @@ class CryptonatorBTCFeed(MarketFeed):

class CryptonatorFeed(MarketFeed):
    def __init__(self):
-       MarketFeed.__init__(
-           self,
+       super().__init__(
            "BTCLBC",
            "cryptonator.com",
            "https://api.cryptonator.com/api/ticker/btc-lbc",
@@ -203,7 +198,7 @@ class CryptonatorFeed(MarketFeed):
        return defer.succeed(float(json_response['ticker']['price']))


-class ExchangeRateManager(object):
+class ExchangeRateManager:
    def __init__(self):
        self.market_feeds = [
            LBRYioBTCFeed(),
@@ -4,25 +4,23 @@ import os

from twisted.internet import defer

-from lbrynet.core import file_utils
from lbrynet.file_manager.EncryptedFileCreator import create_lbry_file

log = logging.getLogger(__name__)


-class Publisher(object):
-   def __init__(self, blob_manager, payment_rate_manager, storage, lbry_file_manager, wallet, certificate_id):
+class Publisher:
+   def __init__(self, blob_manager, payment_rate_manager, storage, lbry_file_manager, wallet, certificate):
        self.blob_manager = blob_manager
        self.payment_rate_manager = payment_rate_manager
        self.storage = storage
        self.lbry_file_manager = lbry_file_manager
        self.wallet = wallet
-       self.certificate_id = certificate_id
+       self.certificate = certificate
        self.lbry_file = None

    @defer.inlineCallbacks
-   def create_and_publish_stream(self, name, bid, claim_dict, file_path, claim_address=None,
-                                 change_address=None):
+   def create_and_publish_stream(self, name, bid, claim_dict, file_path, holding_address=None):
        """Create lbry file and make claim"""
        log.info('Starting publish for %s', name)
        if not os.path.isfile(file_path):
@@ -31,7 +29,7 @@ class Publisher(object):
            raise Exception("Cannot publish empty file {}".format(file_path))

        file_name = os.path.basename(file_path)
-       with file_utils.get_read_handle(file_path) as read_handle:
+       with open(file_path, 'rb') as read_handle:
            self.lbry_file = yield create_lbry_file(
                self.blob_manager, self.storage, self.payment_rate_manager, self.lbry_file_manager, file_name,
                read_handle
@@ -43,11 +41,13 @@ class Publisher(object):
        claim_dict['stream']['source']['sourceType'] = 'lbry_sd_hash'
        claim_dict['stream']['source']['contentType'] = get_content_type(file_path)
        claim_dict['stream']['source']['version'] = "_0_0_1"  # need current version here
-       claim_out = yield self.make_claim(name, bid, claim_dict, claim_address, change_address)
+       tx = yield self.wallet.claim_name(
+           name, bid, claim_dict, self.certificate, holding_address
+       )

        # check if we have a file already for this claim (if this is a publish update with a new stream)
        old_stream_hashes = yield self.storage.get_old_stream_hashes_for_claim_id(
-           claim_out['claim_id'], self.lbry_file.stream_hash
+           tx.outputs[0].claim_id, self.lbry_file.stream_hash
        )
        if old_stream_hashes:
            for lbry_file in filter(lambda l: l.stream_hash in old_stream_hashes,
@@ -56,28 +56,22 @@ class Publisher(object):
                log.info("Removed old stream for claim update: %s", lbry_file.stream_hash)

        yield self.storage.save_content_claim(
-           self.lbry_file.stream_hash, "%s:%i" % (claim_out['txid'], claim_out['nout'])
+           self.lbry_file.stream_hash, tx.outputs[0].id
        )
-       defer.returnValue(claim_out)
+       defer.returnValue(tx)

    @defer.inlineCallbacks
-   def publish_stream(self, name, bid, claim_dict, stream_hash, claim_address=None, change_address=None):
+   def publish_stream(self, name, bid, claim_dict, stream_hash, holding_address=None):
        """Make a claim without creating a lbry file"""
-       claim_out = yield self.make_claim(name, bid, claim_dict, claim_address, change_address)
+       tx = yield self.wallet.claim_name(
+           name, bid, claim_dict, self.certificate, holding_address
+       )
        if stream_hash:  # the stream_hash returned from the db will be None if this isn't a stream we have
            yield self.storage.save_content_claim(
-               stream_hash, "%s:%i" % (claim_out['txid'], claim_out['nout'])
+               stream_hash.decode(), tx.outputs[0].id
            )
            self.lbry_file = [f for f in self.lbry_file_manager.lbry_files if f.stream_hash == stream_hash][0]
-       defer.returnValue(claim_out)
-
-   @defer.inlineCallbacks
-   def make_claim(self, name, bid, claim_dict, claim_address=None, change_address=None):
-       claim_out = yield self.wallet.claim_name(name, bid, claim_dict,
-                                                certificate_id=self.certificate_id,
-                                                claim_address=claim_address,
-                                                change_address=change_address)
-       defer.returnValue(claim_out)


def get_content_type(filename):


@@ -1,4 +1 @@
-from lbrynet import custom_logger
-import Components  # register Component classes
-from lbrynet.daemon.auth.client import LBRYAPIClient
-get_client = LBRYAPIClient.get_client
+from . import Components  # register Component classes


@@ -9,7 +9,7 @@ log = logging.getLogger(__name__)

 @implementer(portal.IRealm)
-class HttpPasswordRealm(object):
+class HttpPasswordRealm:
     def __init__(self, resource):
         self.resource = resource

@@ -21,7 +21,7 @@ class HttpPasswordRealm(object):

 @implementer(checkers.ICredentialsChecker)
-class PasswordChecker(object):
+class PasswordChecker:
     credentialInterfaces = (credentials.IUsernamePassword,)

     def __init__(self, passwords):

@@ -39,8 +39,12 @@ class PasswordChecker(object):
         return cls(passwords)

     def requestAvatarId(self, creds):
-        if creds.username in self.passwords:
-            pw = self.passwords.get(creds.username)
+        password_dict_bytes = {}
+        for api in self.passwords:
+            password_dict_bytes.update({api.encode(): self.passwords[api].encode()})
+        if creds.username in password_dict_bytes:
+            pw = password_dict_bytes.get(creds.username)
             pw_match = creds.checkPassword(pw)
             if pw_match:
                 return defer.succeed(creds.username)
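The `requestAvatarId` change above exists because, under Python 3, Twisted hands credential usernames to the checker as `bytes` while the password file is loaded with `str` keys. A minimal sketch of the re-keying it performs (the key and password values here are illustrative, not the daemon's real keys):

```python
# Twisted cred passes usernames as bytes on Python 3, so a str-keyed
# password map must be re-keyed with .encode() before membership tests.
passwords = {"api": "hunter2"}  # str keys, as loaded from the keyfile

password_dict_bytes = {}
for api in passwords:
    password_dict_bytes[api.encode()] = passwords[api].encode()

username = b"api"  # what Twisted hands to requestAvatarId
assert username not in passwords        # bytes keys never match str keys
assert username in password_dict_bytes  # the re-keyed map matches
pw = password_dict_bytes[username]
```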


@@ -1,10 +1,9 @@
 import os
 import json
-import urlparse
-import requests
-from requests.cookies import RequestsCookieJar
+import aiohttp
 import logging
-from jsonrpc.proxy import JSONRPCProxy
+from urllib.parse import urlparse
 from lbrynet import conf
 from lbrynet.daemon.auth.util import load_api_keys, APIKey, API_KEY_NAME, get_auth_message

@@ -13,28 +12,50 @@ USER_AGENT = "AuthServiceProxy/0.1"
 TWISTED_SESSION = "TWISTED_SESSION"
 LBRY_SECRET = "LBRY_SECRET"
 HTTP_TIMEOUT = 30
-
-
-def copy_cookies(cookies):
-    result = RequestsCookieJar()
-    result.update(cookies)
-    return result
+SCHEME = "http"


 class JSONRPCException(Exception):
     def __init__(self, rpc_error):
-        Exception.__init__(self)
+        super().__init__()
         self.error = rpc_error


-class AuthAPIClient(object):
-    def __init__(self, key, timeout, connection, count, cookies, url, login_url):
+class UnAuthAPIClient:
+    def __init__(self, host, port, session):
+        self.host = host
+        self.port = port
+        self.session = session
+        self.scheme = SCHEME
+
+    def __getattr__(self, method):
+        async def f(*args, **kwargs):
+            return await self.call(method, [args, kwargs])
+        return f
+
+    @classmethod
+    async def from_url(cls, url):
+        url_fragment = urlparse(url)
+        host = url_fragment.hostname
+        port = url_fragment.port
+        session = aiohttp.ClientSession()
+        return cls(host, port, session)
+
+    async def call(self, method, params=None):
+        message = {'method': method, 'params': params}
+        async with self.session.get('{}://{}:{}'.format(self.scheme, self.host, self.port), json=message) as resp:
+            return await resp.json()
+
+
+class AuthAPIClient:
+    def __init__(self, key, session, cookies, url, login_url):
+        self.session = session
         self.__api_key = key
-        self.__service_url = login_url
-        self.__id_count = count
+        self.__login_url = login_url
+        self.__id_count = 0
         self.__url = url
-        self.__conn = connection
-        self.__cookies = copy_cookies(cookies)
+        self.__cookies = cookies

     def __getattr__(self, name):
         if name.startswith('__') and name.endswith('__'):

@@ -45,9 +66,10 @@ class AuthAPIClient(object):
         return f

-    def call(self, method, params=None):
+    async def call(self, method, params=None):
         params = params or {}
         self.__id_count += 1
+
         pre_auth_post_data = {
             'version': '2',
             'method': method,

@@ -55,85 +77,60 @@ class AuthAPIClient(object):
             'id': self.__id_count
         }
         to_auth = get_auth_message(pre_auth_post_data)
-        pre_auth_post_data.update({'hmac': self.__api_key.get_hmac(to_auth)})
+        auth_msg = self.__api_key.get_hmac(to_auth).decode()
+        pre_auth_post_data.update({'hmac': auth_msg})
         post_data = json.dumps(pre_auth_post_data)
-        cookies = copy_cookies(self.__cookies)
-        req = requests.Request(
-            method='POST', url=self.__service_url, data=post_data, cookies=cookies,
-            headers={
-                'Host': self.__url.hostname,
-                'User-Agent': USER_AGENT,
-                'Content-type': 'application/json'
-            }
-        )
-        http_response = self.__conn.send(req.prepare())
-        if http_response is None:
-            raise JSONRPCException({
-                'code': -342, 'message': 'missing HTTP response from server'})
-        http_response.raise_for_status()
-        next_secret = http_response.headers.get(LBRY_SECRET, False)
-        if next_secret:
-            self.__api_key.secret = next_secret
-        self.__cookies = copy_cookies(http_response.cookies)
-        response = http_response.json()
-        if response.get('error') is not None:
-            raise JSONRPCException(response['error'])
-        elif 'result' not in response:
-            raise JSONRPCException({
-                'code': -343, 'message': 'missing JSON-RPC result'})
-        else:
-            return response['result']
+
+        headers = {
+            'Host': self.__url.hostname,
+            'User-Agent': USER_AGENT,
+            'Content-type': 'application/json'
+        }
+
+        async with self.session.post(self.__login_url, data=post_data, headers=headers) as resp:
+            if resp is None:
+                raise JSONRPCException({'code': -342, 'message': 'missing HTTP response from server'})
+            resp.raise_for_status()
+
+            next_secret = resp.headers.get(LBRY_SECRET, False)
+            if next_secret:
+                self.__api_key.secret = next_secret
+
+            return await resp.json()

     @classmethod
-    def config(cls, key_name=None, key=None, pw_path=None, timeout=HTTP_TIMEOUT, connection=None, count=0,
-               cookies=None, auth=None, url=None, login_url=None):
+    async def get_client(cls, key_name=None):
         api_key_name = key_name or API_KEY_NAME
-        pw_path = os.path.join(conf.settings['data_dir'], ".api_keys") if not pw_path else pw_path
-        if not key:
-            keys = load_api_keys(pw_path)
-            api_key = keys.get(api_key_name, False)
-        else:
-            api_key = APIKey(name=api_key_name, secret=key)
-        if login_url is None:
-            service_url = "http://%s:%s@%s:%i/%s" % (api_key_name,
-                                                     api_key.secret,
-                                                     conf.settings['api_host'],
-                                                     conf.settings['api_port'],
-                                                     conf.settings['API_ADDRESS'])
-        else:
-            service_url = login_url
-        id_count = count
-
-        if auth is None and connection is None and cookies is None and url is None:
-            # This is a new client instance, start an authenticated session
-            url = urlparse.urlparse(service_url)
-            conn = requests.Session()
-            req = requests.Request(method='POST',
-                                   url=service_url,
-                                   headers={'Host': url.hostname,
-                                            'User-Agent': USER_AGENT,
-                                            'Content-type': 'application/json'},)
-            r = req.prepare()
-            http_response = conn.send(r)
-            cookies = RequestsCookieJar()
-            cookies.update(http_response.cookies)
-            uid = cookies.get(TWISTED_SESSION)
-            api_key = APIKey.new(seed=uid)
-        else:
-            # This is a client that already has a session, use it
-            conn = connection
-            if not cookies.get(LBRY_SECRET):
-                raise Exception("Missing cookie")
-            secret = cookies.get(LBRY_SECRET)
-            api_key = APIKey(secret, api_key_name)
-        return cls(api_key, timeout, conn, id_count, cookies, url, service_url)
+        pw_path = os.path.join(conf.settings['data_dir'], ".api_keys")
+        keys = load_api_keys(pw_path)
+        api_key = keys.get(api_key_name, False)
+
+        login_url = "http://{}:{}@{}:{}".format(api_key_name, api_key.secret, conf.settings['api_host'],
+                                                conf.settings['api_port'])
+        url = urlparse(login_url)
+
+        headers = {
+            'Host': url.hostname,
+            'User-Agent': USER_AGENT,
+            'Content-type': 'application/json'
+        }
+
+        session = aiohttp.ClientSession()
+
+        async with session.post(login_url, headers=headers) as r:
+            cookies = r.cookies
+
+        uid = cookies.get(TWISTED_SESSION).value
+        api_key = APIKey.new(seed=uid.encode())
+        return cls(api_key, session, cookies, url, login_url)


-class LBRYAPIClient(object):
+class LBRYAPIClient:
     @staticmethod
-    def get_client():
+    def get_client(conf_path=None):
+        conf.conf_file = conf_path
         if not conf.settings:
             conf.initialize_settings()
-        return AuthAPIClient.config() if conf.settings['use_auth_http'] else \
-            JSONRPCProxy.from_url(conf.settings.get_api_connection_string())
+        return AuthAPIClient.get_client() if conf.settings['use_auth_http'] else \
+            UnAuthAPIClient.from_url(conf.settings.get_api_connection_string())


@@ -14,8 +14,8 @@ log = logging.getLogger(__name__)
 class AuthJSONRPCResource(resource.Resource):
     def __init__(self, protocol):
         resource.Resource.__init__(self)
-        self.putChild("", protocol)
-        self.putChild(conf.settings['API_ADDRESS'], protocol)
+        self.putChild(b"", protocol)
+        self.putChild(conf.settings['API_ADDRESS'].encode(), protocol)

     def getChild(self, name, request):
         request.setHeader('cache-control', 'no-cache, no-store, must-revalidate')


@ -1,13 +1,11 @@
import logging import logging
import urlparse from six.moves.urllib import parse as urlparse
import json import json
import inspect import inspect
import signal import signal
from decimal import Decimal
from functools import wraps from functools import wraps
from zope.interface import implements from twisted.web import server
from twisted.web import server, resource
from twisted.internet import defer from twisted.internet import defer
from twisted.python.failure import Failure from twisted.python.failure import Failure
from twisted.internet.error import ConnectionDone, ConnectionLost from twisted.internet.error import ConnectionDone, ConnectionLost
@ -20,16 +18,16 @@ from lbrynet.core import utils
from lbrynet.core.Error import ComponentsNotStarted, ComponentStartConditionNotMet from lbrynet.core.Error import ComponentsNotStarted, ComponentStartConditionNotMet
from lbrynet.core.looping_call_manager import LoopingCallManager from lbrynet.core.looping_call_manager import LoopingCallManager
from lbrynet.daemon.ComponentManager import ComponentManager from lbrynet.daemon.ComponentManager import ComponentManager
from lbrynet.undecorated import undecorated from .util import APIKey, get_auth_message, LBRY_SECRET
from .util import APIKey, get_auth_message from .undecorated import undecorated
from .client import LBRY_SECRET
from .factory import AuthJSONRPCResource from .factory import AuthJSONRPCResource
from lbrynet.daemon.json_response_encoder import JSONResponseEncoder
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
EMPTY_PARAMS = [{}] EMPTY_PARAMS = [{}]
class JSONRPCError(object): class JSONRPCError:
# http://www.jsonrpc.org/specification#error_object # http://www.jsonrpc.org/specification#error_object
CODE_PARSE_ERROR = -32700 # Invalid JSON. Error while parsing the JSON text. CODE_PARSE_ERROR = -32700 # Invalid JSON. Error while parsing the JSON text.
CODE_INVALID_REQUEST = -32600 # The JSON sent is not a valid Request object. CODE_INVALID_REQUEST = -32600 # The JSON sent is not a valid Request object.
@ -59,7 +57,7 @@ class JSONRPCError(object):
} }
def __init__(self, message, code=CODE_APPLICATION_ERROR, traceback=None, data=None): def __init__(self, message, code=CODE_APPLICATION_ERROR, traceback=None, data=None):
assert isinstance(code, (int, long)), "'code' must be an int" assert isinstance(code, int), "'code' must be an int"
assert (data is None or isinstance(data, dict)), "'data' must be None or a dict" assert (data is None or isinstance(data, dict)), "'data' must be None or a dict"
self.code = code self.code = code
if message is None: if message is None:
@ -83,13 +81,8 @@ class JSONRPCError(object):
} }
@classmethod @classmethod
def create_from_exception(cls, exception, code=CODE_APPLICATION_ERROR, traceback=None): def create_from_exception(cls, message, code=CODE_APPLICATION_ERROR, traceback=None):
return cls(exception.message, code=code, traceback=traceback) return cls(message, code=code, traceback=traceback)
def default_decimal(obj):
if isinstance(obj, Decimal):
return float(obj)
class UnknownAPIMethodError(Exception): class UnknownAPIMethodError(Exception):
@ -111,8 +104,7 @@ def jsonrpc_dumps_pretty(obj, **kwargs):
else: else:
data = {"jsonrpc": "2.0", "result": obj, "id": id_} data = {"jsonrpc": "2.0", "result": obj, "id": id_}
return json.dumps(data, cls=jsonrpclib.JSONRPCEncoder, sort_keys=True, indent=2, return json.dumps(data, cls=JSONResponseEncoder, sort_keys=True, indent=2, **kwargs) + "\n"
separators=(',', ': '), **kwargs) + "\n"
class JSONRPCServerType(type): class JSONRPCServerType(type):
@ -131,20 +123,19 @@ class JSONRPCServerType(type):
return klass return klass
class AuthorizedBase(object): class AuthorizedBase(metaclass=JSONRPCServerType):
__metaclass__ = JSONRPCServerType
@staticmethod @staticmethod
def deprecated(new_command=None): def deprecated(new_command=None):
def _deprecated_wrapper(f): def _deprecated_wrapper(f):
f._new_command = new_command f.new_command = new_command
f._deprecated = True f._deprecated = True
return f return f
return _deprecated_wrapper return _deprecated_wrapper
@staticmethod @staticmethod
def requires(*components, **conditions): def requires(*components, **conditions):
if conditions and ["conditions"] != conditions.keys(): if conditions and ["conditions"] != list(conditions.keys()):
raise SyntaxError("invalid conditions argument") raise SyntaxError("invalid conditions argument")
condition_names = conditions.get("conditions", []) condition_names = conditions.get("conditions", [])
@ -189,7 +180,7 @@ class AuthJSONRPCServer(AuthorizedBase):
the server will randomize the shared secret and return the new value under the LBRY_SECRET header, which the the server will randomize the shared secret and return the new value under the LBRY_SECRET header, which the
client uses to generate the token for their next request. client uses to generate the token for their next request.
""" """
implements(resource.IResource) #implements(resource.IResource)
isLeaf = True isLeaf = True
allowed_during_startup = [] allowed_during_startup = []
@ -205,20 +196,23 @@ class AuthJSONRPCServer(AuthorizedBase):
skip_components=to_skip or [], skip_components=to_skip or [],
reactor=reactor reactor=reactor
) )
self.looping_call_manager = LoopingCallManager({n: lc for n, (lc, t) in (looping_calls or {}).iteritems()}) self.looping_call_manager = LoopingCallManager({n: lc for n, (lc, t) in (looping_calls or {}).items()})
self._looping_call_times = {n: t for n, (lc, t) in (looping_calls or {}).iteritems()} self._looping_call_times = {n: t for n, (lc, t) in (looping_calls or {}).items()}
self._use_authentication = use_authentication or conf.settings['use_auth_http'] self._use_authentication = use_authentication or conf.settings['use_auth_http']
self.listening_port = None
self._component_setup_deferred = None self._component_setup_deferred = None
self.announced_startup = False self.announced_startup = False
self.sessions = {} self.sessions = {}
self.server = None
@defer.inlineCallbacks @defer.inlineCallbacks
def start_listening(self): def start_listening(self):
from twisted.internet import reactor, error as tx_error from twisted.internet import reactor, error as tx_error
try: try:
reactor.listenTCP( self.server = self.get_server_factory()
conf.settings['api_port'], self.get_server_factory(), interface=conf.settings['api_host'] self.listening_port = reactor.listenTCP(
conf.settings['api_port'], self.server, interface=conf.settings['api_host']
) )
log.info("lbrynet API listening on TCP %s:%i", conf.settings['api_host'], conf.settings['api_port']) log.info("lbrynet API listening on TCP %s:%i", conf.settings['api_host'], conf.settings['api_port'])
yield self.setup() yield self.setup()
@ -241,7 +235,7 @@ class AuthJSONRPCServer(AuthorizedBase):
reactor.addSystemEventTrigger('before', 'shutdown', self._shutdown) reactor.addSystemEventTrigger('before', 'shutdown', self._shutdown)
if not self.analytics_manager.is_started: if not self.analytics_manager.is_started:
self.analytics_manager.start() self.analytics_manager.start()
for lc_name, lc_time in self._looping_call_times.iteritems(): for lc_name, lc_time in self._looping_call_times.items():
self.looping_call_manager.start(lc_name, lc_time) self.looping_call_manager.start(lc_name, lc_time)
def update_attribute(setup_result, component): def update_attribute(setup_result, component):
@ -259,7 +253,12 @@ class AuthJSONRPCServer(AuthorizedBase):
# ignore INT/TERM signals once shutdown has started # ignore INT/TERM signals once shutdown has started
signal.signal(signal.SIGINT, self._already_shutting_down) signal.signal(signal.SIGINT, self._already_shutting_down)
signal.signal(signal.SIGTERM, self._already_shutting_down) signal.signal(signal.SIGTERM, self._already_shutting_down)
if self.listening_port:
self.listening_port.stopListening()
self.looping_call_manager.shutdown() self.looping_call_manager.shutdown()
if self.server is not None:
for session in list(self.server.sessions.values()):
session.expire()
if self.analytics_manager: if self.analytics_manager:
self.analytics_manager.shutdown() self.analytics_manager.shutdown()
try: try:
@ -287,8 +286,8 @@ class AuthJSONRPCServer(AuthorizedBase):
request.setHeader(LBRY_SECRET, self.sessions.get(session_id).secret) request.setHeader(LBRY_SECRET, self.sessions.get(session_id).secret)
@staticmethod @staticmethod
def _render_message(request, message): def _render_message(request, message: str):
request.write(message) request.write(message.encode())
request.finish() request.finish()
def _render_error(self, failure, request, id_): def _render_error(self, failure, request, id_):
@ -299,8 +298,15 @@ class AuthJSONRPCServer(AuthorizedBase):
error = failure.check(JSONRPCError) error = failure.check(JSONRPCError)
if error is None: if error is None:
# maybe its a twisted Failure with another type of error # maybe its a twisted Failure with another type of error
error = JSONRPCError(failure.getErrorMessage() or failure.type.__name__, if hasattr(failure.type, "code"):
traceback=failure.getTraceback()) error_code = failure.type.code
else:
error_code = JSONRPCError.CODE_APPLICATION_ERROR
error = JSONRPCError.create_from_exception(
failure.getErrorMessage() or failure.type.__name__,
code=error_code,
traceback=failure.getTraceback()
)
if not failure.check(ComponentsNotStarted, ComponentStartConditionNotMet): if not failure.check(ComponentsNotStarted, ComponentStartConditionNotMet):
log.warning("error processing api request: %s\ntraceback: %s", error.message, log.warning("error processing api request: %s\ntraceback: %s", error.message,
"\n".join(error.traceback)) "\n".join(error.traceback))
@ -308,7 +314,7 @@ class AuthJSONRPCServer(AuthorizedBase):
# last resort, just cast it as a string # last resort, just cast it as a string
error = JSONRPCError(str(failure)) error = JSONRPCError(str(failure))
response_content = jsonrpc_dumps_pretty(error, id=id_) response_content = jsonrpc_dumps_pretty(error, id=id_, ledger=self.ledger)
self._set_headers(request, response_content) self._set_headers(request, response_content)
request.setResponseCode(200) request.setResponseCode(200)
self._render_message(request, response_content) self._render_message(request, response_content)
@ -324,7 +330,7 @@ class AuthJSONRPCServer(AuthorizedBase):
return self._render(request) return self._render(request)
except BaseException as e: except BaseException as e:
log.error(e) log.error(e)
error = JSONRPCError.create_from_exception(e, traceback=format_exc()) error = JSONRPCError.create_from_exception(str(e), traceback=format_exc())
self._render_error(error, request, None) self._render_error(error, request, None)
return server.NOT_DONE_YET return server.NOT_DONE_YET
@ -344,7 +350,6 @@ class AuthJSONRPCServer(AuthorizedBase):
def expire_session(): def expire_session():
self._unregister_user_session(session_id) self._unregister_user_session(session_id)
session.startCheckingExpiration()
session.notifyOnExpire(expire_session) session.notifyOnExpire(expire_session)
message = "OK" message = "OK"
request.setResponseCode(200) request.setResponseCode(200)
@ -355,12 +360,12 @@ class AuthJSONRPCServer(AuthorizedBase):
session.touch() session.touch()
request.content.seek(0, 0) request.content.seek(0, 0)
content = request.content.read() content = request.content.read().decode()
try: try:
parsed = jsonrpclib.loads(content) parsed = jsonrpclib.loads(content)
except ValueError: except json.JSONDecodeError:
log.warning("Unable to decode request json") log.warning("Unable to decode request json")
self._render_error(JSONRPCError(None, JSONRPCError.CODE_PARSE_ERROR), request, None) self._render_error(JSONRPCError(None, code=JSONRPCError.CODE_PARSE_ERROR), request, None)
return server.NOT_DONE_YET return server.NOT_DONE_YET
request_id = None request_id = None
@ -384,7 +389,8 @@ class AuthJSONRPCServer(AuthorizedBase):
log.warning("API validation failed") log.warning("API validation failed")
self._render_error( self._render_error(
JSONRPCError.create_from_exception( JSONRPCError.create_from_exception(
err, code=JSONRPCError.CODE_AUTHENTICATION_ERROR, str(err),
code=JSONRPCError.CODE_AUTHENTICATION_ERROR,
traceback=format_exc() traceback=format_exc()
), ),
request, request_id request, request_id
@ -399,12 +405,12 @@ class AuthJSONRPCServer(AuthorizedBase):
except UnknownAPIMethodError as err: except UnknownAPIMethodError as err:
log.warning('Failed to get function %s: %s', function_name, err) log.warning('Failed to get function %s: %s', function_name, err)
self._render_error( self._render_error(
JSONRPCError(None, JSONRPCError.CODE_METHOD_NOT_FOUND), JSONRPCError(None, code=JSONRPCError.CODE_METHOD_NOT_FOUND),
request, request_id request, request_id
) )
return server.NOT_DONE_YET return server.NOT_DONE_YET
if args == EMPTY_PARAMS or args == []: if args in (EMPTY_PARAMS, []):
_args, _kwargs = (), {} _args, _kwargs = (), {}
elif isinstance(args, dict): elif isinstance(args, dict):
_args, _kwargs = (), args _args, _kwargs = (), args
@ -510,7 +516,7 @@ class AuthJSONRPCServer(AuthorizedBase):
def _get_jsonrpc_method(self, function_path): def _get_jsonrpc_method(self, function_path):
if function_path in self.deprecated_methods: if function_path in self.deprecated_methods:
new_command = self.deprecated_methods[function_path]._new_command new_command = self.deprecated_methods[function_path].new_command
log.warning('API function \"%s\" is deprecated, please update to use \"%s\"', log.warning('API function \"%s\" is deprecated, please update to use \"%s\"',
function_path, new_command) function_path, new_command)
function_path = new_command function_path = new_command
@ -519,7 +525,7 @@ class AuthJSONRPCServer(AuthorizedBase):
@staticmethod @staticmethod
def _check_params(function, args_tup, args_dict): def _check_params(function, args_tup, args_dict):
argspec = inspect.getargspec(undecorated(function)) argspec = inspect.getfullargspec(undecorated(function))
num_optional_params = 0 if argspec.defaults is None else len(argspec.defaults) num_optional_params = 0 if argspec.defaults is None else len(argspec.defaults)
duplicate_params = [ duplicate_params = [
@ -539,7 +545,7 @@ class AuthJSONRPCServer(AuthorizedBase):
if len(missing_required_params): if len(missing_required_params):
return 'Missing required parameters', missing_required_params return 'Missing required parameters', missing_required_params
extraneous_params = [] if argspec.keywords is not None else [ extraneous_params = [] if argspec.varkw is not None else [
extra_param extra_param
for extra_param in args_dict for extra_param in args_dict
if extra_param not in argspec.args[1:] if extra_param not in argspec.args[1:]
@ -568,10 +574,10 @@ class AuthJSONRPCServer(AuthorizedBase):
def _callback_render(self, result, request, id_, auth_required=False): def _callback_render(self, result, request, id_, auth_required=False):
try: try:
encoded_message = jsonrpc_dumps_pretty(result, id=id_, default=default_decimal) message = jsonrpc_dumps_pretty(result, id=id_, ledger=self.ledger)
request.setResponseCode(200) request.setResponseCode(200)
self._set_headers(request, encoded_message, auth_required) self._set_headers(request, message, auth_required)
self._render_message(request, encoded_message) self._render_message(request, message)
except Exception as err: except Exception as err:
log.exception("Failed to render API response: %s", result) log.exception("Failed to render API response: %s", result)
self._render_error(err, request, id_) self._render_error(err, request, id_)


@@ -33,11 +33,11 @@ def undecorated(o):
     except AttributeError:
         pass

-    # try:
-    #     # python3
-    #     closure = o.__closure__
-    # except AttributeError:
-    #     return
+    try:
+        # python3
+        closure = o.__closure__
+    except AttributeError:
+        return

     if closure:
         for cell in closure:


@@ -9,21 +9,22 @@ import logging
 log = logging.getLogger(__name__)

 API_KEY_NAME = "api"
+LBRY_SECRET = "LBRY_SECRET"


-def sha(x):
+def sha(x: bytes) -> bytes:
     h = hashlib.sha256(x).digest()
     return base58.b58encode(h)


-def generate_key(x=None):
+def generate_key(x: bytes = None) -> bytes:
     if x is None:
         return sha(os.urandom(256))
     else:
         return sha(x)


-class APIKey(object):
+class APIKey:
     def __init__(self, secret, name, expiration=None):
         self.secret = secret
         self.name = name

@@ -40,7 +41,7 @@ class APIKey(object):

     def get_hmac(self, message):
         decoded_key = self._raw_key()
-        signature = hmac.new(decoded_key, message, hashlib.sha256)
+        signature = hmac.new(decoded_key, message.encode(), hashlib.sha256)
         return base58.b58encode(signature.digest())

     def compare_hmac(self, message, token):

@@ -65,7 +66,7 @@ def load_api_keys(path):
     keys_for_return = {}
     for key_name in data:
         key = data[key_name]
-        secret = key['secret']
+        secret = key['secret'].decode()
         expiration = key['expiration']
         keys_for_return.update({key_name: APIKey(secret, key_name, expiration)})

     return keys_for_return
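The scheme `get_hmac`/`compare_hmac` implement is: sign the serialized request with the shared secret, and let the server recompute and compare the signature. A hedged, stdlib-only sketch (the real code base58-encodes the digest; this uses `hexdigest()` for brevity, and the key/message values are illustrative):

```python
# Per-request HMAC signing: client and server share a secret; the client
# HMACs the auth message, the server recomputes and compares in constant time.
import hashlib
import hmac

secret = b"shared-api-secret"               # illustrative stand-in for APIKey.secret
message = "serialized-request-from-get_auth_message"

def get_hmac(key: bytes, msg: str) -> str:
    return hmac.new(key, msg.encode(), hashlib.sha256).hexdigest()

token = get_hmac(secret, message)

# compare_hmac should use a constant-time comparison to avoid timing attacks
assert hmac.compare_digest(token, get_hmac(secret, message))
assert not hmac.compare_digest(token, get_hmac(b"wrong-secret", message))
```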


@@ -0,0 +1,46 @@
+from decimal import Decimal
+from binascii import hexlify
+from datetime import datetime
+from json import JSONEncoder
+from lbrynet.wallet.transaction import Transaction, Output
+
+
+class JSONResponseEncoder(JSONEncoder):
+
+    def __init__(self, *args, ledger, **kwargs):
+        super().__init__(*args, **kwargs)
+        self.ledger = ledger
+
+    def default(self, obj):  # pylint: disable=method-hidden
+        if isinstance(obj, Transaction):
+            return self.encode_transaction(obj)
+        if isinstance(obj, Output):
+            return self.encode_output(obj)
+        if isinstance(obj, datetime):
+            return obj.strftime("%Y%m%dT%H:%M:%S")
+        if isinstance(obj, Decimal):
+            return float(obj)
+        if isinstance(obj, bytes):
+            return obj.decode()
+        return super().default(obj)
+
+    def encode_transaction(self, tx):
+        return {
+            'txid': tx.id,
+            'inputs': [self.encode_input(txo) for txo in tx.inputs],
+            'outputs': [self.encode_output(txo) for txo in tx.outputs],
+            'total_input': tx.input_sum,
+            'total_output': tx.input_sum - tx.fee,
+            'total_fee': tx.fee,
+            'hex': hexlify(tx.raw).decode(),
+        }
+
+    def encode_output(self, txo):
+        return {
+            'nout': txo.position,
+            'amount': txo.amount,
+            'address': txo.get_address(self.ledger)
+        }
+
+    def encode_input(self, txi):
+        return self.encode_output(txi.txo_ref.txo)
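`JSONResponseEncoder` follows the standard `json.JSONEncoder` extension pattern: override `default()` to translate types the stock encoder rejects, and fall back to `super().default()` so unknown types still raise `TypeError`. A stdlib-only illustration, without the wallet types:

```python
# Subclass JSONEncoder and override default() for Decimal/datetime/bytes,
# mirroring the non-wallet branches of JSONResponseEncoder.default().
import json
from datetime import datetime
from decimal import Decimal

class DemoEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, datetime):
            return obj.strftime("%Y%m%dT%H:%M:%S")
        if isinstance(obj, Decimal):
            return float(obj)
        if isinstance(obj, bytes):
            return obj.decode()
        return super().default(obj)  # raises TypeError for anything else

payload = {"amount": Decimal("1.5"), "raw": b"beef",
           "when": datetime(2018, 8, 24, 15, 4, 35)}
encoded = json.dumps(payload, cls=DemoEncoder, sort_keys=True)
print(encoded)
# {"amount": 1.5, "raw": "beef", "when": "20180824T15:04:35"}
```

`jsonrpc_dumps_pretty` passes the encoder via `cls=` the same way, which is why the daemon can return `Transaction` objects directly from API methods.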


@@ -39,7 +39,7 @@ def migrate_blobs_db(db_dir):
         blobs_db_cursor.execute(
             "ALTER TABLE blobs ADD COLUMN should_announce integer NOT NULL DEFAULT 0")
     else:
-        log.warn("should_announce already exists somehow, proceeding anyways")
+        log.warning("should_announce already exists somehow, proceeding anyways")

     # if lbryfile_info.db doesn't exist, skip marking blobs as should_announce = True
     if not os.path.isfile(lbryfile_info_db):

@@ -83,4 +83,3 @@ def migrate_blobs_db(db_dir):
     blobs_db_file.commit()
     blobs_db_file.close()
     lbryfile_info_file.close()
-


@@ -247,7 +247,7 @@ def do_migration(db_dir):
     claim_queries = {}  # <sd_hash>: claim query tuple

     # get the claim queries ready, only keep those with associated files
-    for outpoint, sd_hash in file_outpoints.iteritems():
+    for outpoint, sd_hash in file_outpoints.items():
         if outpoint in claim_outpoint_queries:
             claim_queries[sd_hash] = claim_outpoint_queries[outpoint]

@@ -260,7 +260,7 @@ def do_migration(db_dir):
             claim_arg_tup[7], claim_arg_tup[6], claim_arg_tup[8],
             smart_decode(claim_arg_tup[8]).certificate_id, claim_arg_tup[5], claim_arg_tup[4]
         )
-        for sd_hash, claim_arg_tup in claim_queries.iteritems() if claim_arg_tup
+        for sd_hash, claim_arg_tup in claim_queries.items() if claim_arg_tup
     ]  # sd_hash, (txid, nout, claim_id, name, sequence, address, height, amount, serialized)
     )

@@ -268,7 +268,7 @@ def do_migration(db_dir):
     damaged_stream_sds = []

     # import the files and get sd hashes of streams to attempt recovering
-    for sd_hash, file_query in file_args.iteritems():
+    for sd_hash, file_query in file_args.items():
         failed_sd = _import_file(*file_query)
         if failed_sd:
             damaged_stream_sds.append(failed_sd)


@ -2,6 +2,7 @@ import logging
import os import os
import sqlite3 import sqlite3
import traceback import traceback
from binascii import hexlify, unhexlify
from decimal import Decimal from decimal import Decimal
from twisted.internet import defer, task, threads from twisted.internet import defer, task, threads
from twisted.enterprise import adbapi from twisted.enterprise import adbapi
@ -11,7 +12,8 @@ from lbryschema.decode import smart_decode
from lbrynet import conf from lbrynet import conf
from lbrynet.cryptstream.CryptBlob import CryptBlobInfo from lbrynet.cryptstream.CryptBlob import CryptBlobInfo
from lbrynet.dht.constants import dataExpireTimeout from lbrynet.dht.constants import dataExpireTimeout
from lbryum.constants import COIN from lbrynet.wallet.database import WalletDatabase
from torba.constants import COIN
log = logging.getLogger(__name__) log = logging.getLogger(__name__)
@@ -83,18 +85,19 @@ def rerun_if_locked(f):

 class SqliteConnection(adbapi.ConnectionPool):
     def __init__(self, db_path):
-        adbapi.ConnectionPool.__init__(self, 'sqlite3', db_path, check_same_thread=False)
+        super().__init__('sqlite3', db_path, check_same_thread=False)

     @rerun_if_locked
     def runInteraction(self, interaction, *args, **kw):
-        return adbapi.ConnectionPool.runInteraction(self, interaction, *args, **kw)
+        return super().runInteraction(interaction, *args, **kw)

     @classmethod
     def set_reactor(cls, reactor):
         cls.reactor = reactor


-class SQLiteStorage(object):
+class SQLiteStorage:
     CREATE_TABLES_QUERY = """
             pragma foreign_keys=on;
             pragma journal_mode=WAL;
@@ -164,7 +167,7 @@ class SQLiteStorage(object):
                 timestamp integer,
                 primary key (sd_hash, reflector_address)
             );
-    """
+    """ + WalletDatabase.CREATE_TABLES_QUERY

     def __init__(self, db_dir, reactor=None):
         if not reactor:
@@ -209,6 +212,12 @@ class SQLiteStorage(object):
         else:
             defer.returnValue([])

+    def run_and_return_id(self, query, *args):
+        def do_save(t):
+            t.execute(query, args)
+            return t.lastrowid
+        return self.db.runInteraction(do_save)
+
     def stop(self):
         if self.check_should_announce_lc and self.check_should_announce_lc.running:
             self.check_should_announce_lc.stop()
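The new `run_and_return_id` helper factors out the insert-and-return-rowid pattern via sqlite3's `Cursor.lastrowid`. A standalone sketch of the same idea without Twisted's `runInteraction` (the table and values here are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table file (stream_hash text, file_name text)")

def run_and_return_id(query, *args):
    # mirrors the helper above: run the insert, hand back the rowid
    # of the freshly inserted row
    cursor = conn.execute(query, args)
    return cursor.lastrowid

rowid = run_and_return_id("insert into file values (?, ?)", "abc", "video.mp4")
assert rowid == 1  # first row in an empty rowid table gets rowid 1
```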
@@ -259,7 +268,7 @@ class SQLiteStorage(object):
         blob_hashes = yield self.run_and_return_list(
             "select blob_hash from blob where status='finished'"
         )
-        defer.returnValue([blob_hash.decode('hex') for blob_hash in blob_hashes])
+        defer.returnValue([unhexlify(blob_hash) for blob_hash in blob_hashes])

     def count_finished_blobs(self):
         return self.run_and_return_one_or_none(
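Python 3 drops the `str.encode('hex')`/`str.decode('hex')` codecs, so the diff substitutes `binascii.hexlify`/`unhexlify` throughout. Both operate on and return `bytes`, which is why extra `.encode()`/`.decode()` calls appear around them elsewhere in this file. A quick sketch of the correspondence:

```python
from binascii import hexlify, unhexlify

blob_hash = b"\xde\xad\xbe\xef"
hex_repr = hexlify(blob_hash)       # bytes in, hex bytes out (not str)
assert hex_repr == b"deadbeef"
assert unhexlify(hex_repr) == blob_hash
# unhexlify also accepts a str of hex digits:
assert unhexlify("deadbeef") == blob_hash
# converting the hex bytes to a str requires an explicit decode:
assert hex_repr.decode() == "deadbeef"
```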
@@ -483,21 +492,17 @@ class SQLiteStorage(object):
     @defer.inlineCallbacks
     def save_downloaded_file(self, stream_hash, file_name, download_directory, data_payment_rate):
         # touch the closest available file to the file name
-        file_name = yield open_file_for_writing(download_directory.decode('hex'), file_name.decode('hex'))
+        file_name = yield open_file_for_writing(unhexlify(download_directory).decode(), unhexlify(file_name).decode())
         result = yield self.save_published_file(
-            stream_hash, file_name.encode('hex'), download_directory, data_payment_rate
+            stream_hash, hexlify(file_name.encode()), download_directory, data_payment_rate
         )
         defer.returnValue(result)

     def save_published_file(self, stream_hash, file_name, download_directory, data_payment_rate, status="stopped"):
-        def do_save(db_transaction):
-            db_transaction.execute(
-                "insert into file values (?, ?, ?, ?, ?)",
-                (stream_hash, file_name, download_directory, data_payment_rate, status)
-            )
-            file_rowid = db_transaction.lastrowid
-            return file_rowid
-        return self.db.runInteraction(do_save)
+        return self.run_and_return_id(
+            "insert into file values (?, ?, ?, ?, ?)",
+            stream_hash, file_name, download_directory, data_payment_rate, status
+        )

     def get_filename_for_rowid(self, rowid):
         return self.run_and_return_one_or_none("select file_name from file where rowid=?", rowid)
@@ -609,7 +614,7 @@ class SQLiteStorage(object):
                 source_hash = None
         except AttributeError:
             source_hash = None
-        serialized = claim_info.get('hex') or smart_decode(claim_info['value']).serialized.encode('hex')
+        serialized = claim_info.get('hex') or hexlify(smart_decode(claim_info['value']).serialized)
         transaction.execute(
             "insert or replace into claim values (?, ?, ?, ?, ?, ?, ?, ?, ?)",
             (outpoint, claim_id, name, amount, height, serialized, certificate_id, address, sequence)
@@ -651,6 +656,19 @@ class SQLiteStorage(object):
         if support_dl:
             yield defer.DeferredList(support_dl)

+    def save_claims_for_resolve(self, claim_infos):
+        to_save = []
+        for info in claim_infos:
+            if 'value' in info:
+                if info['value']:
+                    to_save.append(info)
+            else:
+                if 'certificate' in info and info['certificate']['value']:
+                    to_save.append(info['certificate'])
+                if 'claim' in info and info['claim']['value']:
+                    to_save.append(info['claim'])
+        return self.save_claims(to_save)
+
     def get_old_stream_hashes_for_claim_id(self, claim_id, new_stream_hash):
         return self.run_and_return_list(
             "select f.stream_hash from file f "
@@ -667,7 +685,7 @@ class SQLiteStorage(object):
         ).fetchone()
         if not claim_info:
             raise Exception("claim not found")
-        new_claim_id, claim = claim_info[0], ClaimDict.deserialize(claim_info[1].decode('hex'))
+        new_claim_id, claim = claim_info[0], ClaimDict.deserialize(unhexlify(claim_info[1]))

         # certificate claims should not be in the content_claim table
         if not claim.is_stream:
@@ -680,7 +698,7 @@ class SQLiteStorage(object):
         if not known_sd_hash:
             raise Exception("stream not found")
         # check the claim contains the same sd hash
-        if known_sd_hash[0] != claim.source_hash:
+        if known_sd_hash[0].encode() != claim.source_hash:
             raise Exception("stream mismatch")

         # if there is a current claim associated to the file, check that the new claim is an update to it
@@ -828,7 +846,7 @@ class SQLiteStorage(object):
     def save_claim_tx_heights(self, claim_tx_heights):
         def _save_claim_heights(transaction):
-            for outpoint, height in claim_tx_heights.iteritems():
+            for outpoint, height in claim_tx_heights.items():
                 transaction.execute(
                     "update claim set height=? where claim_outpoint=? and height=-1",
                     (height, outpoint)
@@ -864,7 +882,7 @@ def _format_claim_response(outpoint, claim_id, name, amount, height, serialized,
         "claim_id": claim_id,
         "address": address,
         "claim_sequence": claim_sequence,
-        "value": ClaimDict.deserialize(serialized.decode('hex')).claim_dict,
+        "value": ClaimDict.deserialize(unhexlify(serialized)).claim_dict,
         "height": height,
         "amount": float(Decimal(amount) / Decimal(COIN)),
         "nout": int(outpoint.split(":")[1]),
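`COIN` (now imported from `torba.constants` instead of `lbryum.constants`) is the integer scaling factor between the base currency unit and its smallest denomination; the response formatter divides with exact `Decimal` arithmetic before converting to `float`. A sketch of that conversion, assuming the conventional 10^8 scale:

```python
from decimal import Decimal

COIN = 100_000_000  # assumed 10^8 scale, as is conventional for torba.constants.COIN

amount = 150_000_000  # raw integer amount as stored in the claim table
# exact Decimal division first avoids binary-float rounding in the quotient
assert float(Decimal(amount) / Decimal(COIN)) == 1.5
```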
@@ -1,16 +1,18 @@
 import ipaddress
+from binascii import hexlify
+from functools import reduce
 from lbrynet.dht import constants


 def is_valid_ipv4(address):
     try:
-        ip = ipaddress.ip_address(address.decode())  # this needs to be unicode, thus the decode()
+        ip = ipaddress.ip_address(address)
         return ip.version == 4
     except ipaddress.AddressValueError:
         return False
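The dropped `.decode()` reflects that `ipaddress.ip_address` wants a (unicode) string, which in Python 3 is what a plain `str` already is. A sketch of the validator; note it catches the broader `ValueError` (which `AddressValueError` subclasses) so malformed input of either IP version is rejected cleanly:

```python
import ipaddress

def is_valid_ipv4(address):
    # in Python 3 `address` is already a unicode str, so no .decode() needed
    try:
        return ipaddress.ip_address(address).version == 4
    except ValueError:
        return False

assert is_valid_ipv4("127.0.0.1")
assert not is_valid_ipv4("::1")        # valid address, but IPv6
assert not is_valid_ipv4("999.0.0.1")  # not a valid address at all
```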

-class _Contact(object):
+class _Contact:
     """ Encapsulation for remote contact

     This class contains information on a single remote contact, and also
@@ -19,8 +21,8 @@ class _Contact(object):
     def __init__(self, contactManager, id, ipAddress, udpPort, networkProtocol, firstComm):
         if id is not None:
-            if not len(id) == constants.key_bits / 8:
-                raise ValueError("invalid node id: %s" % id.encode('hex'))
+            if not len(id) == constants.key_bits // 8:
+                raise ValueError("invalid node id: {}".format(hexlify(id).decode()))
         if not 0 <= udpPort <= 65536:
             raise ValueError("invalid port")
         if not is_valid_ipv4(ipAddress):
@@ -56,7 +58,7 @@ class _Contact(object):
     def log_id(self, short=True):
         if not self.id:
             return "not initialized"
-        id_hex = self.id.encode('hex')
+        id_hex = hexlify(self.id)
         return id_hex if not short else id_hex[:8]

     @property
@@ -95,25 +97,17 @@ class _Contact(object):
         return None

     def __eq__(self, other):
-        if isinstance(other, _Contact):
-            return self.id == other.id
-        elif isinstance(other, str):
-            return self.id == other
-        else:
-            return False
+        if not isinstance(other, _Contact):
+            raise TypeError("invalid type to compare with Contact: %s" % str(type(other)))
+        return (self.id, self.address, self.port) == (other.id, other.address, other.port)

-    def __ne__(self, other):
-        if isinstance(other, _Contact):
-            return self.id != other.id
-        elif isinstance(other, str):
-            return self.id != other
-        else:
-            return True
+    def __hash__(self):
+        return hash((self.id, self.address, self.port))

     def compact_ip(self):
         compact_ip = reduce(
             lambda buff, x: buff + bytearray([int(x)]), self.address.split('.'), bytearray())
-        return str(compact_ip)
+        return compact_ip

     def set_id(self, id):
         if not self._id:
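The `__hash__` addition above is not optional style: in Python 3, a class that defines `__eq__` without `__hash__` becomes unhashable (Python 2 silently kept the default identity hash), and `__ne__` is now derived from `__eq__` automatically. A minimal sketch of the contract, using a simplified stand-in class rather than the real `_Contact`:

```python
class Contact:
    # hypothetical stand-in for _Contact: equal objects must hash equally
    def __init__(self, id, address, port):
        self.id, self.address, self.port = id, address, port

    def __eq__(self, other):
        if not isinstance(other, Contact):
            raise TypeError("invalid type to compare with Contact: %s" % type(other))
        return (self.id, self.address, self.port) == (other.id, other.address, other.port)

    def __hash__(self):
        # hash over the same tuple __eq__ compares, keeping the contract
        return hash((self.id, self.address, self.port))

a = Contact(b"\x01", "1.2.3.4", 4444)
b = Contact(b"\x01", "1.2.3.4", 4444)
assert a == b and hash(a) == hash(b)
assert len({a, b}) == 1  # both land in the same set slot
```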
@@ -156,12 +150,12 @@ class _Contact(object):
             raise AttributeError("unknown command: %s" % name)

         def _sendRPC(*args, **kwargs):
-            return self._networkProtocol.sendRPC(self, name, args)
+            return self._networkProtocol.sendRPC(self, name.encode(), args)

         return _sendRPC


-class ContactManager(object):
+class ContactManager:
     def __init__(self, get_time=None):
         if not get_time:
             from twisted.internet import reactor
@@ -171,12 +165,11 @@ class ContactManager(object):
         self._rpc_failures = {}

     def get_contact(self, id, address, port):
-        for contact in self._contacts.itervalues():
+        for contact in self._contacts.values():
             if contact.id == id and contact.address == address and contact.port == port:
                 return contact

     def make_contact(self, id, ipAddress, udpPort, networkProtocol, firstComm=0):
-        ipAddress = str(ipAddress)
         contact = self.get_contact(id, ipAddress, udpPort)
         if contact:
             return contact
@ -1,27 +1,21 @@
import UserDict from collections import UserDict
import constants from . import constants
from interface import IDataStore
from zope.interface import implements
class DictDataStore(UserDict.DictMixin): class DictDataStore(UserDict):
""" A datastore using an in-memory Python dictionary """ """ A datastore using an in-memory Python dictionary """
implements(IDataStore) #implements(IDataStore)
def __init__(self, getTime=None): def __init__(self, getTime=None):
# Dictionary format: # Dictionary format:
# { <key>: (<contact>, <value>, <lastPublished>, <originallyPublished> <originalPublisherID>) } # { <key>: (<contact>, <value>, <lastPublished>, <originallyPublished> <originalPublisherID>) }
self._dict = {} super().__init__()
if not getTime: if not getTime:
from twisted.internet import reactor from twisted.internet import reactor
getTime = reactor.seconds getTime = reactor.seconds
self._getTime = getTime self._getTime = getTime
self.completed_blobs = set() self.completed_blobs = set()
def keys(self):
""" Return a list of the keys in this data store """
return self._dict.keys()
def filter_bad_and_expired_peers(self, key): def filter_bad_and_expired_peers(self, key):
""" """
Returns only non-expired and unknown/good peers Returns only non-expired and unknown/good peers
@@ -29,41 +23,44 @@ class DictDataStore(UserDict.DictMixin):
         return filter(
             lambda peer:
             self._getTime() - peer[3] < constants.dataExpireTimeout and peer[0].contact_is_good is not False,
-            self._dict[key]
+            self[key]
         )

     def filter_expired_peers(self, key):
         """
         Returns only non-expired peers
         """
-        return filter(lambda peer: self._getTime() - peer[3] < constants.dataExpireTimeout, self._dict[key])
+        return filter(lambda peer: self._getTime() - peer[3] < constants.dataExpireTimeout, self[key])

     def removeExpiredPeers(self):
-        for key in self._dict.keys():
-            unexpired_peers = self.filter_expired_peers(key)
+        expired_keys = []
+        for key in self.keys():
+            unexpired_peers = list(self.filter_expired_peers(key))
             if not unexpired_peers:
-                del self._dict[key]
+                expired_keys.append(key)
             else:
-                self._dict[key] = unexpired_peers
+                self[key] = unexpired_peers
+        for key in expired_keys:
+            del self[key]

     def hasPeersForBlob(self, key):
-        return True if key in self._dict and len(self.filter_bad_and_expired_peers(key)) else False
+        return True if key in self and len(tuple(self.filter_bad_and_expired_peers(key))) else False

     def addPeerToBlob(self, contact, key, compact_address, lastPublished, originallyPublished, originalPublisherID):
-        if key in self._dict:
-            if compact_address not in map(lambda store_tuple: store_tuple[1], self._dict[key]):
-                self._dict[key].append(
+        if key in self:
+            if compact_address not in map(lambda store_tuple: store_tuple[1], self[key]):
+                self[key].append(
                     (contact, compact_address, lastPublished, originallyPublished, originalPublisherID)
                 )
         else:
-            self._dict[key] = [(contact, compact_address, lastPublished, originallyPublished, originalPublisherID)]
+            self[key] = [(contact, compact_address, lastPublished, originallyPublished, originalPublisherID)]

     def getPeersForBlob(self, key):
-        return [] if key not in self._dict else [val[1] for val in self.filter_bad_and_expired_peers(key)]
+        return [] if key not in self else [val[1] for val in self.filter_bad_and_expired_peers(key)]

     def getStoringContacts(self):
         contacts = set()
-        for key in self._dict:
-            for values in self._dict[key]:
+        for key in self:
+            for values in self[key]:
                 contacts.add(values[0])
         return list(contacts)
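The `removeExpiredPeers` rewrite is driven by two Python 3 changes: deleting dict keys while iterating raises `RuntimeError` in CPython (Python 2's `.keys()` returned a snapshot list, making it safe), and `filter()` is now lazy, hence the `list(...)` call before the truthiness check. A minimal sketch of the deferred-deletion pattern with an illustrative dict:

```python
d = {"a": [], "b": [1]}

# deferred deletion: collect the keys first, delete after the loop
expired = [k for k, peers in d.items() if not peers]
for k in expired:
    del d[k]
assert d == {"b": [1]}

# by contrast, mutating while iterating raises in CPython 3:
try:
    for k in d:
        del d[k]
    raised = False
except RuntimeError:
    raised = True
assert raised
```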
@ -1,21 +1,21 @@
from lbrynet.dht import constants from lbrynet.dht import constants
class Distance(object): class Distance:
"""Calculate the XOR result between two string variables. """Calculate the XOR result between two string variables.
Frequently we re-use one of the points so as an optimization Frequently we re-use one of the points so as an optimization
we pre-calculate the long value of that point. we pre-calculate the value of that point.
""" """
def __init__(self, key): def __init__(self, key):
if len(key) != constants.key_bits / 8: if len(key) != constants.key_bits // 8:
raise ValueError("invalid key length: %i" % len(key)) raise ValueError("invalid key length: %i" % len(key))
self.key = key self.key = key
self.val_key_one = long(key.encode('hex'), 16) self.val_key_one = int.from_bytes(key, 'big')
def __call__(self, key_two): def __call__(self, key_two):
val_key_two = long(key_two.encode('hex'), 16) val_key_two = int.from_bytes(key_two, 'big')
return self.val_key_one ^ val_key_two return self.val_key_one ^ val_key_two
def is_closer(self, a, b): def is_closer(self, a, b):
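`long(key.encode('hex'), 16)` becomes `int.from_bytes(key, 'big')`, the direct Python 3 way to read a byte string as a big-endian integer (the `long` type itself is gone). A sketch of the XOR distance metric, simplified to omit the key-length check:

```python
class Distance:
    # simplified version of the class above, without the key-length check
    def __init__(self, key):
        self.val_key_one = int.from_bytes(key, 'big')

    def __call__(self, key_two):
        return self.val_key_one ^ int.from_bytes(key_two, 'big')

d = Distance(b"\x00\xff")
assert d(b"\x00\xff") == 0        # identical keys -> distance 0
assert d(b"\x00\x00") == 0xff
# int.from_bytes is equivalent to the old long(hex, 16) round-trip:
assert int.from_bytes(b"\x01\x00", "big") == int("0100", 16) == 256
```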
@ -1,134 +1,75 @@
from error import DecodeError from .error import DecodeError
class Encoding(object): def bencode(data):
""" Interface for RPC message encoders/decoders """ Encoder implementation of the Bencode algorithm (Bittorrent). """
if isinstance(data, int):
All encoding implementations used with this library should inherit and return b'i%de' % data
implement this. elif isinstance(data, (bytes, bytearray)):
""" return b'%d:%s' % (len(data), data)
elif isinstance(data, str):
def encode(self, data): return b'%d:%s' % (len(data), data.encode())
""" Encode the specified data elif isinstance(data, (list, tuple)):
encoded_list_items = b''
@param data: The data to encode for item in data:
This method has to support encoding of the following encoded_list_items += bencode(item)
types: C{str}, C{int} and C{long} return b'l%se' % encoded_list_items
Any additional data types may be supported as long as the elif isinstance(data, dict):
implementing class's C{decode()} method can successfully encoded_dict_items = b''
decode them. keys = data.keys()
for key in sorted(keys):
@return: The encoded data encoded_dict_items += bencode(key)
@rtype: str encoded_dict_items += bencode(data[key])
""" return b'd%se' % encoded_dict_items
else:
def decode(self, data): raise TypeError("Cannot bencode '%s' object" % type(data))
""" Decode the specified data string
@param data: The data (byte string) to decode.
@type data: str
@return: The decoded data (in its correct type)
"""
class Bencode(Encoding): def bdecode(data):
""" Implementation of a Bencode-based algorithm (Bencode is the encoding """ Decoder implementation of the Bencode algorithm. """
algorithm used by Bittorrent). assert type(data) == bytes # fixme: _maybe_ remove this after porting
if len(data) == 0:
raise DecodeError('Cannot decode empty string')
try:
return _decode_recursive(data)[0]
except ValueError as e:
raise DecodeError(str(e))
@note: This algorithm differs from the "official" Bencode algorithm in
that it can encode/decode floating point values in addition to
integers.
"""
def encode(self, data): def _decode_recursive(data, start_index=0):
""" Encoder implementation of the Bencode algorithm if data[start_index] == ord('i'):
end_pos = data[start_index:].find(b'e') + start_index
@param data: The data to encode return int(data[start_index + 1:end_pos]), end_pos + 1
@type data: int, long, tuple, list, dict or str elif data[start_index] == ord('l'):
start_index += 1
@return: The encoded data decoded_list = []
@rtype: str while data[start_index] != ord('e'):
""" list_data, start_index = _decode_recursive(data, start_index)
if isinstance(data, (int, long)): decoded_list.append(list_data)
return 'i%de' % data return decoded_list, start_index + 1
elif isinstance(data, str): elif data[start_index] == ord('d'):
return '%d:%s' % (len(data), data) start_index += 1
elif isinstance(data, (list, tuple)): decoded_dict = {}
encodedListItems = '' while data[start_index] != ord('e'):
for item in data: key, start_index = _decode_recursive(data, start_index)
encodedListItems += self.encode(item) value, start_index = _decode_recursive(data, start_index)
return 'l%se' % encodedListItems decoded_dict[key] = value
elif isinstance(data, dict): return decoded_dict, start_index
encodedDictItems = '' elif data[start_index] == ord('f'):
keys = data.keys() # This (float data type) is a non-standard extension to the original Bencode algorithm
keys.sort() end_pos = data[start_index:].find(b'e') + start_index
for key in keys: return float(data[start_index + 1:end_pos]), end_pos + 1
encodedDictItems += self.encode(key) # TODO: keys should always be bytestrings elif data[start_index] == ord('n'):
encodedDictItems += self.encode(data[key]) # This (None/NULL data type) is a non-standard extension
return 'd%se' % encodedDictItems # to the original Bencode algorithm
else: return None, start_index + 1
print data else:
raise TypeError("Cannot bencode '%s' object" % type(data)) split_pos = data[start_index:].find(b':') + start_index
def decode(self, data):
""" Decoder implementation of the Bencode algorithm
@param data: The encoded data
@type data: str
@note: This is a convenience wrapper for the recursive decoding
algorithm, C{_decodeRecursive}
@return: The decoded data, as a native Python type
@rtype: int, list, dict or str
"""
if len(data) == 0:
raise DecodeError('Cannot decode empty string')
try: try:
return self._decodeRecursive(data)[0] length = int(data[start_index:split_pos])
except ValueError as e: except ValueError:
raise DecodeError(e.message) raise DecodeError()
start_index = split_pos + 1
@staticmethod end_pos = start_index + length
def _decodeRecursive(data, startIndex=0): b = data[start_index:end_pos]
""" Actual implementation of the recursive Bencode algorithm return b, end_pos
Do not call this; use C{decode()} instead
"""
if data[startIndex] == 'i':
endPos = data[startIndex:].find('e') + startIndex
return int(data[startIndex + 1:endPos]), endPos + 1
elif data[startIndex] == 'l':
startIndex += 1
decodedList = []
while data[startIndex] != 'e':
listData, startIndex = Bencode._decodeRecursive(data, startIndex)
decodedList.append(listData)
return decodedList, startIndex + 1
elif data[startIndex] == 'd':
startIndex += 1
decodedDict = {}
while data[startIndex] != 'e':
key, startIndex = Bencode._decodeRecursive(data, startIndex)
value, startIndex = Bencode._decodeRecursive(data, startIndex)
decodedDict[key] = value
return decodedDict, startIndex
elif data[startIndex] == 'f':
# This (float data type) is a non-standard extension to the original Bencode algorithm
endPos = data[startIndex:].find('e') + startIndex
return float(data[startIndex + 1:endPos]), endPos + 1
elif data[startIndex] == 'n':
# This (None/NULL data type) is a non-standard extension
# to the original Bencode algorithm
return None, startIndex + 1
else:
splitPos = data[startIndex:].find(':') + startIndex
try:
length = int(data[startIndex:splitPos])
except ValueError, e:
raise DecodeError, e
startIndex = splitPos + 1
endPos = startIndex + length
bytes = data[startIndex:endPos]
return bytes, endPos

Some files were not shown because too many files have changed in this diff.