
What Python version do you all use, and why?

mlzboy · Dec 7, 2011 · 6325 views

As per the title.

19 replies
1
clino · Dec 7, 2011
Mostly 2.6.x, since Ubuntu ships with it and it's fairly mainstream.
One Windows server had a dependency whose module only exists for 2.5.x, so I had to install 2.5 there.
2
xiaket · Dec 7, 2011
2.7, and starting to move toward 3.
3
9hills · Dec 7, 2011
Arch is aggressive about this... fortunately there's virtualenv:
    [cynic@9hills:~]
    % python --version
    Python 3.2.2
    [cynic@9hills:~]
    % python2 --version
    Python 2.7.2
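Several replies here run 2.x and 3.x side by side on the same machine. As a minimal sketch (my addition, not from the thread), code that has to work on either interpreter can branch on the version tuple at runtime:

```python
import sys

# sys.version_info is a tuple-like object: (major, minor, micro, ...)
if sys.version_info[0] >= 3:
    note = "running on Python 3"
else:
    note = "running on Python 2"

print(sys.version_info[:2], note)
```

The same comparison (`sys.version_info >= (2, 7)`, etc.) is a common guard for features that only exist in newer versions.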
4
tiantian20007 · Dec 7, 2011
You could try pypy.org. The original mailing-list post:

    Summary: Pypy 1.7 is almost twice as fast as cpython 2.7 on the
    tornado HTTPServer benchmark, and three times as fast on the
    template-rendering benchmark. Pypy used about three times as much
    memory in both cases, but usage was stable over time (i.e. it's not
    "leaking" like pypy 1.6 did).

    Details: Tested on an "m1.large" EC2 instance running ubuntu 10.04
    and the 64-bit binary releases of pypy (and cpython 2.7 from
    https://launchpad.net/~fkrull/+archive/deadsnakes). Large instances
    have two cores so the ab client wasn't competing for cpu cycles with
    the python process.

    ubuntu@ip-10-114-147-85:~/tornado$ /usr/bin/time python2.6
    demos/benchmark/benchmark.py --quiet --num_runs=5|grep "Requests per
    second"
    Requests per second: 2010.39 [#/sec] (mean)
    Requests per second: 2010.70 [#/sec] (mean)
    Requests per second: 2000.56 [#/sec] (mean)
    Requests per second: 2007.37 [#/sec] (mean)
    Requests per second: 1983.27 [#/sec] (mean)
    34.19user 7.29system 0:37.75elapsed 109%CPU (0avgtext+0avgdata
    59040maxresident)k
    0inputs+0outputs (0major+22404minor)pagefaults 0swaps

    ubuntu@ip-10-114-147-85:~/tornado$ /usr/bin/time python2.7
    demos/benchmark/benchmark.py --quiet --num_runs=5|grep "Requests per
    second"
    Requests per second: 2127.01 [#/sec] (mean)
    Requests per second: 2122.64 [#/sec] (mean)
    Requests per second: 2219.56 [#/sec] (mean)
    Requests per second: 2209.65 [#/sec] (mean)
    Requests per second: 2204.07 [#/sec] (mean)
    30.62user 6.76system 0:34.83elapsed 107%CPU (0avgtext+0avgdata
    60480maxresident)k
    0inputs+8outputs (1major+20898minor)pagefaults 0swaps

    ubuntu@ip-10-114-147-85:~/tornado$ /usr/bin/time ~/pypy-1.6/bin/pypy
    demos/benchmark/benchmark.py --quiet --num_runs=5|grep "Requests per
    second"
    Requests per second: 2436.49 [#/sec] (mean)
    Requests per second: 3289.71 [#/sec] (mean)
    Requests per second: 3469.36 [#/sec] (mean)
    Requests per second: 3453.05 [#/sec] (mean)
    Requests per second: 3373.15 [#/sec] (mean)
    22.06user 6.76system 0:25.76elapsed 111%CPU (0avgtext+0avgdata
    328208maxresident)k
    0inputs+3328outputs (0major+89216minor)pagefaults 0swaps

    ubuntu@ip-10-114-147-85:~/tornado$ /usr/bin/time ~/pypy-1.7/bin/pypy
    demos/benchmark/benchmark.py --quiet --num_runs=5|grep "Requests per
    second"
    Requests per second: 2673.48 [#/sec] (mean)
    Requests per second: 4410.94 [#/sec] (mean)
    Requests per second: 4155.42 [#/sec] (mean)
    Requests per second: 4164.85 [#/sec] (mean)
    Requests per second: 4626.81 [#/sec] (mean)
    17.15user 6.95system 0:21.08elapsed 114%CPU (0avgtext+0avgdata
    186576maxresident)k
    0inputs+2680outputs (0major+59250minor)pagefaults 0swaps

    ubuntu@ip-10-114-147-85:~/tornado$ /usr/bin/time python2.7
    demos/benchmark/template_benchmark.py --num=1000
    38.003 ms per iteration
    37.56user 0.54system 0:38.14elapsed 99%CPU (0avgtext+0avgdata 46672maxresident)k
    0inputs+136outputs (0major+119479minor)pagefaults 0swaps

    ubuntu@ip-10-114-147-85:~/tornado$ /usr/bin/time ~/pypy-1.7/bin/pypy
    demos/benchmark/template_benchmark.py --num=1000
    12.899 ms per iteration
    13.08user 0.03system 0:13.11elapsed 99%CPU (0avgtext+0avgdata
    153296maxresident)k
    0inputs+144outputs (0major+14267minor)pagefaults 0swaps
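The numbers above come from Tornado's own demos/benchmark scripts. As a rough, hypothetical stand-in (the workload below is mine, not Tornado's template engine), the same "ms per iteration" style of measurement can be reproduced with the stdlib timeit module:

```python
import timeit

# Hypothetical pure-Python workload standing in for template rendering;
# any callable can be timed the same way.
def render():
    return "".join(str(i) for i in range(1000))

runs = 5
# timeit.timeit accepts a callable and returns total seconds for `number` runs
seconds = timeit.timeit(render, number=runs)
print("%.3f ms per iteration" % (seconds / runs * 1000))
```

For JIT runtimes like PyPy, note that short runs mostly measure warm-up; the mailing-list post's use of multiple runs (where later runs are faster) reflects exactly that.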
5
likuku · Dec 7, 2011
Gentoo: 2.7 & 3; Mac OS X: 2.5
6
freetstar · Dec 7, 2011
archlinux py2.7
7
mlzboy (OP) · Dec 8, 2011
@9hills Can this also manage the Python version itself? I always thought it could only manage package versions.
8
mathgl · Dec 8, 2011
py 2.6.6...
9
9hills · Dec 9, 2011
@mlzboy virtualenv -p /usr/bin/python2.6
-p is short for --python, so the --python=python2.5 form works too.

P.S. virtualenv + pip is the way to go~~ add Fabric for remote deployment and it's perfect.
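After creating an environment with `virtualenv -p`, a quick sanity check (my addition, not from the thread) is to ask the running interpreter where it lives; both virtualenv and Python 3's built-in venv leave markers in the sys module:

```python
import sys

# Path of the interpreter binary actually executing this script
print(sys.executable)
# Inside an activated environment, sys.prefix points at the env directory
print(sys.prefix)

# virtualenv historically sets sys.real_prefix; Python 3 venvs make
# sys.base_prefix differ from sys.prefix. Either signals an active env.
in_env = (
    getattr(sys, "real_prefix", None) is not None
    or sys.prefix != getattr(sys, "base_prefix", sys.prefix)
)
print("inside a virtual environment:", in_env)
```

This is handy when, as in the replies above, several interpreters coexist and it is not obvious which one `python` resolves to.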
10
epic2005 · Dec 20, 2011
2.7
11
empilot · Dec 20, 2011
2.7
12
CMGS · Dec 20, 2011
2.7
13
zhouyang · Dec 20, 2011
2.7 on Lion
14
windhunter · Dec 20, 2011
2.7 on Ubuntu & Win7.
Having the newest version isn't what matters most; stability comes first.
15
oldgun · Dec 20, 2011
Been using 2.5 since it came out and have never switched.
No need to.
16
kojp · Dec 20, 2011
On Windows, 2.7 & 2.6. RHEL apparently ships with 2.4 preinstalled.
17
alexzhan · Dec 21, 2011
2.7, switching to 3 soon.
18
fanzeyi · Dec 21, 2011
2.7
19
reus · Dec 21, 2011 via Android
2.7