
You're confusing form-encoded and JSON data here. request.POST['foo'] is for form-encoded data. You are posting raw JSON, so you should use request.body.

received_json_data = json.loads(request.body)

or, decoding the bytes explicitly first (required on Python 3):

data = request.body.decode('utf-8')
received_json_data = json.loads(data)
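To illustrate the decode-then-parse step outside of a view, with a byte string standing in for request.body:

```python
import json

# The raw request body is bytes; decode it, then parse the JSON.
raw_body = b'{"foo": "bar"}'  # stands in for request.body
received_json_data = json.loads(raw_body.decode('utf-8'))
```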

python - How to receive json data using HTTP POST request in Django 1....

python django django-views http-post python-requests

received_json_data = json.loads(request.body.decode("utf-8"))

python - How to receive json data using HTTP POST request in Django 1....

python django django-views http-post python-requests

If the data that you are receiving is, in fact, encoded in UTF-8, then it should be a sequence of bytes -- a Python 'str' object, in Python 2.X

You can verify this with an assertion:

assert isinstance(content, str)

Once you know that that's true, you can move to the actual encoding. Python doesn't do transcoding -- directly from UTF-8 to ASCII, for instance. You need to first turn your sequence of bytes into a Unicode string, by decoding it:

unicode_content = content.decode('utf-8')

(If you can trust parsed_feed.encoding, then use that instead of the literal 'utf-8'. Either way, be prepared for errors.)

You can then take that string, and encode it in ASCII, substituting high characters with their XML entity equivalents:

xml_content = unicode_content.encode('ascii', 'xmlcharrefreplace')

The full method, then, would look something like this:

try:
    content = content.decode(parsed_feed.encoding).encode('ascii', 'xmlcharrefreplace')
except UnicodeDecodeError:
    # Couldn't decode the incoming string -- possibly not encoded in utf-8.
    # Do something here to report the error before re-raising.
    raise
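As a quick illustration of xmlcharrefreplace, a non-ASCII character is replaced by its XML entity reference:

```python
# u'caf\u00e9' contains 'é' (U+00E9), which is outside ASCII.
text = u'caf\u00e9'
xml_content = text.encode('ascii', 'xmlcharrefreplace')  # b'caf&#233;'
```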

python - Encoding gives "'ascii' codec can't encode character … ordina...

python django unicode character-encoding
Rectangle 27 3

As long as it's not asynchronous (doing sending and receiving at once), you can use the socket interface.

If you like abstractions (or need asynchronous support), there is always Twisted.

Here is an example with the socket interface (which will become harder to use as your program grows larger, so I would suggest either Twisted or asyncore):

import socket

def mysend(sock, msg):
    totalsent = 0
    while totalsent < len(msg):
        sent = sock.send(msg[totalsent:])
        if sent == 0:
            raise RuntimeError("socket connection broken")
        totalsent = totalsent + sent

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)

s.connect(("where ever you have your other computer", 6767))  # the port must be an integer, not a string

i = 2
mysend(s, str(i))

The python documentation is excellent, I picked up the mysend() function from there.

If you are doing computation related work, check out XML-RPC, which python has all nicely packaged up for you.

Remember, sockets are just like files, so writing code for them isn't much different. As long as you can do basic file I/O and understand events, socket programming isn't hard at all (as long as you don't get too complicated, like multiplexing VoIP streams...).
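For completeness, a receiving counterpart to mysend() in the same style as the documentation's example (the caller supplying the expected message length is an assumption of this sketch):

```python
def myreceive(sock, msglen):
    # recv() may return fewer bytes than asked for, so loop until
    # the full msglen bytes have arrived.
    chunks = []
    received = 0
    while received < msglen:
        chunk = sock.recv(min(msglen - received, 2048))
        if chunk == b'':
            raise RuntimeError("socket connection broken")
        chunks.append(chunk)
        received += len(chunk)
    return b''.join(chunks)
```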


networking - Easiest Way to Transfer Data Over the Internet, Python - ...

python networking data-transfer

This would be an expected difference between Python and Java. Most likely you aren't seeing differences in the amount of time to make the query, but the amount of time it takes to parse the result and fill the receiving data structure.

You can test this by comparing the time it takes to query a single record. Remember that you'll need to test several times and average the total to get a true benchmark to account for possible fluctuations in latency on the backend.
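A minimal sketch of that benchmarking approach (the query callable and run count are placeholders):

```python
import time

def average_query_time(run_query, runs=5):
    # Run the query several times and average the wall-clock time,
    # to account for latency fluctuations on the backend.
    timings = []
    for _ in range(runs):
        start = time.time()
        run_query()
        timings.append(time.time() - start)
    return sum(timings) / len(timings)
```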

Both Python and Java are compiled to byte codes which are interpreted on a virtual machine. Python and Ruby are not equivalent in that regard.

You're right, it looks like all the slowdown for the Python code happens when decoding the returned protocol buffer data into Entity objects in the SDK's datastore.py. Small datastore queries (10 objects) show no perceptible performance difference between Java and Python.

@Joshua, the generalization you make at the end of your post is not quite accurate in general, as adam pointed out. In addition, one needs to consider the specific context of app engine, where java apps with low traffic have to pay the startup cost of initializing the entire jvm quite frequently.

From the Python documentation "Python is an interpreted language, as opposed to a compiled one, though the distinction can be blurry because of the presence of the bytecode compiler. This means that source files can be run directly without explicitly creating an executable which is then run."

Is appengine Python datastore query much (>3x) slower than Java? - Sta...

java python google-app-engine google-cloud-datastore

If your setup contains multiple databases, and you have a test that requires every database, you can use the multi_db attribute on the test suite to request a full flush.

class TestMyViews(TestCase):
    multi_db = True

    def testIndexPageView(self):
        call_some_test_code()

That documentation (multi-database support testing) is not exact, because the multi_db condition (indirectly, in _databases_names) is used in the Django source not only for flush (tearDown) but also for _fixture_setup (Django-1.5.1/django/test/testcases.py:834). Therefore it seems to be a basic condition, independent of master/slave settings.

Afraid I don't think that's the answer in this case. We have something like a master/slave relationship in production (on Heroku with a "follower" db), but when doing local dev we just point the db config for both to a single db. When the test is constructing the environment, it goes to create a _test copy of each existing database, but since it sees two in the config, it tries to create the same one twice. This is what TEST_MIRROR does fix, as it knows not to create the db for the mirror, but it seems not to inform the ORM to pass through queries from the mirror db to the master.

python - Django testing mirror database not receiving data - Stack Ove...

python django unit-testing testing

The error you're seeing means the data you receive from the remote end isn't valid JSON. JSON (according to the specification) is normally UTF-8, but can also be UTF-16 or UTF-32 (in either big- or little-endian.) The exact error you're seeing means some part of the data was not valid UTF-8 (and also wasn't UTF-16 or UTF-32, as those would produce different errors.)

Perhaps you should examine the actual response you receive from the remote end, instead of blindly passing the data to json.loads(). Right now, you're reading all the data from the response into a string and assuming it's JSON. Instead, check the content type of the response. Make sure the webpage is actually claiming to give you JSON and not, for example, an error message that isn't JSON.

(Also, after checking the response use json.load() by passing it the file-like object returned by opener.open(), instead of reading all data into a string and passing that to json.loads().)
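To illustrate the json.load() point, with an in-memory stream standing in for the file-like object returned by opener.open():

```python
import io
import json

# json.load() parses straight from a file-like object --
# no need to read everything into a string first.
response_body = io.StringIO('{"city": "Berlin"}')  # stands in for opener.open(url)
data = json.load(response_body)
```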

opener.open does not return a file-like object: TypeError: expected string or buffer. I edited my post and added the string that is causing the problem.

That TypeError means you're still using json.loads() instead of json.load(). opener.open() does return a file-like object, because you use it as one in your code. The JSON string you have is invalid JSON -- \xf6 is not in UTF-8, only in some single-byte encodings (like iso-8859-1.) JSON is not supposed to be given in those encodings, just UTF-8, UTF-16 or UTF-32. You will either have to fix the supplier of the JSON (to make it use \\u00f6 instead of \xf6) or find out what encoding this is and recode into UTF-8 before parsing it as JSON.

oops, sorry, i did notice the "s". but thank you, i will just give it up )-:

python - UnicodeDecodeError: 'utf8' codec can't decode bytes in positi...

python unicode python-2.x

The can't adapt error is raised by psycopg2 when it receives a data type that it doesn't know how to translate into a value for a SQL statement. For example, if you accidentally pass a list for a value that is supposed to be an integer, psycopg2 will raise this can't adapt error.

The faq.txt document that ships with the source distribution of psycopg2 explains it this way:

!cursor.execute()

Psycopg converts Python objects to a SQL string representation by looking at the object's class. The exception is raised when you try to pass as a query parameter an object for which there is no adapter registered for its class. See :ref:adapting-new-types for information.
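As an illustration of the usual fix (plain Python here, no psycopg2 required): coerce types that have no registered adapter, such as a set, into ones that do, such as a list, before passing them as query parameters. The parameter names are hypothetical:

```python
# Hypothetical parameters: psycopg2 has no default adapter for set objects.
params = {'tags': {'a', 'b'}, 'count': 3}

# Convert sets to sorted lists; other values pass through unchanged.
safe_params = {k: sorted(v) if isinstance(v, set) else v
               for k, v in params.items()}
```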

Well, I was hoping loaddata verbosity would work and I wouldn't have to confess that I've never found an elegant way of debugging adaptation errors with django's loaddata. In the past, I've resorted to inserting print statements in django's loaddata function so that I can see the values being deserialized when the error occurs. I've edited django/core/management/loaddata.py. Look of obj.save() in the handle() function. I hope this confession inspires someone to share a better solution :-)

mmmm.... So what can I do? I simply need to import data into database... Maybe there is another way of doing that?

If you take the same low-level approach I do, open up the copy of django on your system and add a print statement just before obj.save() in the loaddata.py module to see what erroneous data you're getting out of the system. The issue will be the last print statement before the stack trace.

Yep it is. See my latest editing. But I don't understand how the issue appeared... Why DoesNotExist: User matching query does not exist.

No, that did not helped. The error Doesn't exist gone... But can't adapt appeared...

postgresql - django: can't adapt error when importing data from postgr...

django postgresql psycopg2

When the test environment is configured, a test version of slave will not be created. Instead the connection to slave will be redirected to point at default

Since the slave doesn't really exist in the testing, it kinda makes sense that trying to call it directly fails
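That redirection is configured with the TEST_MIRROR database setting; a minimal settings.py sketch (the engine and database names are assumed):

```python
# settings.py sketch: during tests no database is created for 'slave';
# connections to it are redirected to the test database for 'default'.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'myapp',
    },
    'slave': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'myapp',
        'TEST_MIRROR': 'default',
    },
}
```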

I read "Instead the connection to slave will be redirected to point at default" as anything pointed at the slave will pass through to the master, so it seems like it should work. Perhaps it's just a case that I'm reading it wrong (and IMO it could be written more clearly in the docs...)

I'm not an official source, and I get your point, but my understanding is that this only happens in testing environment

python - Django testing mirror database not receiving data - Stack Ove...

python django unit-testing testing

You're sending from a local variable, but it goes out of scope before writing completes.

Make message a member variable.

NOTE: The img_lock is not thread safe

Here's a working demo:

#ifndef _IMG_SESSION_H_
#define _IMG_SESSION_H_

#include <boost/bind.hpp>
#include <boost/asio.hpp>
#include <iostream>
#include <boost/thread.hpp>

struct session {
    session(boost::asio::io_service& svc) : svc(svc) {
    }

    void clean() {}
    boost::asio::io_service& svc;
    boost::asio::ip::tcp::socket _socket{svc};
    int max_length = 1024;
    //std::array<char, 1024> _data;
    char _data[1024];
};

class img_session : public session
{
public:
    img_session(boost::asio::io_service& io_service, std::string * img_stream, int*  img_lock);
    ~img_session();

    /* Starts the session. This is called in the constructor.*/
    void start();

private:
    void handle_read(const boost::system::error_code& error, size_t bytes_transferred);
    void handle_write(const boost::system::error_code& error, size_t bytes_transferred);

    std::string * img_stream;
    int * img_lock;
    std::string message;
};

#endif /* _IMG_SESSION_H */

/* Public Methods */
img_session:: img_session(boost::asio::io_service& io_service, std::string * img_stream, int * img_lock) : session(io_service), img_stream(img_stream), img_lock(img_lock){
}

img_session::~img_session() { }

void img_session::start()
{
    std::cout << "connection from " << _socket.remote_endpoint() << "\n";
    std::cout << "img_session waiting for READY signal..." << std::endl;
    _socket.async_read_some(boost::asio::buffer(_data, max_length),
        boost::bind(&img_session::handle_read, this,
            boost::asio::placeholders::error,
            boost::asio::placeholders::bytes_transferred));
}

/* Private Methods */

void img_session::handle_read(const boost::system::error_code& error, size_t /*bytes_transferred*/)
{
    if (!error)
    {
        //char otherString[6];
        std::cout << "[INFO ] IMG READ" << std::endl;
        if (strncmp(_data, "READY", 5) == 0) {
            std::cout << "[INFO ] READY signal received..." << std::endl;
            while (*img_lock == 1) boost::this_thread::yield();
            *img_lock = 1;
            message = "0000" + *img_stream;
            size_t len = message.length();

            std::cout << "len = " << len << std::endl;

            message[0] = (len >> 24) & 0xFF;
            message[1] = (len >> 16) & 0xFF;
            message[2] = (len >> 8) & 0xFF;
            message[3] = len & 0xFF;
            std::cout << "img_session.cpp: bytes[] = " << (int) message[0] << " " << (int)message[1] << " " << (int)message[2] << " " << (int)message[3] << std::endl;
            std::cout << "img_session.cpp: message.length() = " << message.length() << std::endl;
            std::cout << "img_session.cpp: HEAD: " << message.substr(4, 1024) << std::endl;
            std::cout << "img_session.cpp: TAIL: " << message.substr(message.length() - 1024, message.length() - 1);


            boost::asio::async_write(_socket,
                boost::asio::buffer(message), 
                boost::bind(&img_session::handle_write, this,
                    boost::asio::placeholders::error, boost::asio::placeholders::bytes_transferred));

            *img_lock = 0;
        }
        else {
            std::cout << "[INFO ] READY signal not received..." << std::endl;
            /*
             *boost::asio::async_write(_socket,
             *    boost::asio::buffer("", 1),
             *    boost::bind(&img_session::handle_write, this,
             *        boost::asio::placeholders::error, boost::asio::placeholders::bytes_transferred));
             */
        }
        clean();
    }
    else
    {
        delete this;
    }
}

void img_session::handle_write(const boost::system::error_code& error, size_t bytes_transferred)
{
    if (!error)
    {
        std::cout << "img_session waiting for READY signal..." << std::endl;
        _socket.async_read_some(boost::asio::buffer(_data, max_length),
            boost::bind(&img_session::handle_read, this,
                boost::asio::placeholders::error,
                boost::asio::placeholders::bytes_transferred));
    }
    else
    {
        std::cout << __FUNCTION__ << ":" << error.message() << "\n";
        delete this;
    }
}

int main() {
    boost::asio::io_service svc;
    std::string img_stream(300<<10, '*');
    int img_lock = 0;
    auto s = new img_session(svc, &img_stream, &img_lock);

    using boost::asio::ip::tcp;

    tcp::acceptor a(svc);
    a.open(tcp::v4());
    a.set_option(tcp::acceptor::reuse_address(true));
    a.bind({{}, 6767});
    a.listen(1);
    a.accept(s->_socket);

    s->start();

    svc.run();
}

This, when using a client

netcat localhost 6767 | wc 
READY
0       1  307204

Python socket not receiving all data from C++ Boost asio - Stack Overf...

python c++ sockets boost boost-asio

The link with settimeout() was right. It raises an exception on timeout.

Set a timeout on blocking socket operations. The value argument can be a nonnegative floating point number expressing seconds, or None. If a non-zero value is given, subsequent socket operations will raise a timeout exception if the timeout period value has elapsed before the operation has completed. If zero is given, the socket is put in non-blocking mode. If None is given, the socket is put in blocking mode.

You need to put your code in a try block, so that the Exception doesn't abort your program.

from socket import timeout as TimeoutException

# set a timeout of 5 seconds
clientSocket.settimeout(5)
for i in range(0, 10):
  sequence_number = i
  start = time.time()
  clientSocket.sendto("Ping " + str(i) + " " + str(start), server)
  # Receive the reply along with the address it is coming from
  try:
    message, address = clientSocket.recvfrom(1024)
  except TimeoutException:
    print("Timeout!!! Try again...")
    continue
  end = time.time()
  if message != '':
    print(message)
    rtt = end - start
    print("RTT = " + str(rtt))

Python: fixed wait time for receiving socket data - Stack Overflow

python sockets

If you are using the Django REST framework, then you are expected to produce model instances (database results) or simple Python primitives (built-in types), and it'll take care of serialisation to JSON for you. By abstracting away the serialization, the framework can implement content-negotiation, where the client can pick what format they receive the data in. That could be JSON, but it could also be something else. I suspect that returning a JSON string is going to upset the assumptions the framework makes.

Return your cursor data in a rest_framework.response.Response object instead, do not serialize this yourself:

from rest_framework.response import Response
from contextlib import closing

# ...
conn = pymysql.connect(host='127.0.0.1', port=3306, user='root', passwd='password123', db='sakila')
with closing(conn), conn as cur:
    with cur:
        cur.execute("SELECT  city_id, city, country_id FROM city")
        return Response(list(cur))

REST framework supports HTTP content negotiation by providing a Response class which allows you to return content that can be rendered into multiple content types, depending on the client request.

Response
SimpleTemplateResponse

Response objects are initialised with data, which should consist of native Python primitives

In the above example I also used contextlib.closing() to ensure the connection is closed even if there are exceptions in the view, and then used the connection as a context manager to produce the cursor, and then the cursor to ensure it too is closed.

If you do have an an actual model, then use the Django ORM and don't create direct connections yourself. You are using a big, well-integrated machine and are ignoring 95% of that machine here. You won't get connection pooling, transaction management, pagination, etc. etc. etc. Just use proper querysets and model views in that case.

I'm doing like this cur.execute("SELECT city_id, city, country_id FROM city") queryset = json.dumps(list(cur))

@NullPointer: yes, that's correct. Please add the full traceback to your question, because if you still get that error then your posted code is not the cause.

python - TypeError: 'method' object is not iterable MySQL - Stack Over...

python mysql django django-rest-framework

From the "user" point of view, a class method in Python is a method that receives its class as its first parameter - unlike "ordinary" methods which receive an instance of the class as its first parameter - which by convention is called self.

If you retrieve an "ordinary" method from a class, instead of from an instace of that class, you get an "unbound method" - i.e. an object that is a wrapper around a function, but that does not automatically adds either the class itself, nor any instance as the first parameter when it is called. Threfore if you are to call the "unbound method" you have to manually pass an instance of its class as its first parameter.

If you manually call a class method, on the other hand, the class is filled in as the first parameter for you:

>>> class A(object):
...   def b(self):
...      pass
...   @classmethod
...   def c(cls):
...      pass
... 
>>> A.b
<unbound method A.b>
>>> A.c
<bound method type.c of <class '__main__.A'>>
>>> A.c()
>>> A.b()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: unbound method b() must be called with A instance as first argument (got nothing instead)
>>>

Under the hood, what goes on is more or less like this - with "new style classes":

When one defines a class body, the methods are just ordinary functions - when the class body is over, Python calls the class's metaclass (which ordinarily is the builtin type) - and passes to it as parameters the name, base classes, and class body dictionary. This call yields a class - which in Python is itself an object, since everything is an object.

Now, Python has some nifty ways of customizing attribute access - the so-called "descriptors". A descriptor is any object that defines a method named __get__ (or __set__ or __del__, but we don't care about those here). When one accesses an attribute of a class or object in Python, the object referred to by that attribute is returned - except if it is a class attribute and the object is a descriptor. In that case, instead of returning the object itself, Python calls the __get__ method on that object and returns its result instead. For example, the property built-in is just a class that implements __get__, __set__ and __del__ as appropriate.

Now, here is what happens when the attribute is retrieved: any function (or class method or unbound method, as the data model states) in a class body has a __get__ method, which makes it a descriptor. On each attribute access, that descriptor creates a new object wrapping the function - an object that, when called, will have its first parameter automatically filled in - which is to say, a method.

>>> class B(object):
...    def c(self):
...      pass
...    print c
... 
<function c at 0x1927398>
>>> print B.c
<unbound method B.c>
>>> b = B()
>>> b.c
<bound method B.c of <__main__.B object at 0x1930a10>>

If you want to retrieve the function object, without conversion to a method object, you can do so through the class's __dict__ attribute, which does not trigger the descriptor:

>>> B.__dict__["c"]
<function c at 0x1927398>
>>> B.__dict__["c"].__get__
<method-wrapper '__get__' of function object at 0x1927398>
>>> B.__dict__["c"].__get__(b, B)
<bound method B.c of <__main__.B object at 0x1930a10>>
>>> B.__dict__["c"].__get__(None, B)
<unbound method B.c>

As for "class methods", these are just different type of objects, which are explicitly decorated with the builtin classmethod - The object it returns when its __get__ is called is a wrapper around the original function that will fill in the cls as the first parameter on call.

So a bound/unbound user-defined method is a "method of an instance", while a classmethod is a "method of a class", right?

In other words, if you try to call an unbound method directly from the class you get an exception; to call this method you must first create an object and then call the method on that object. If you want to call a method directly from the class, you can use the @staticmethod decorator.

@Denis Well, when getting the attribute which is a method decorated with @classmethod, can we say that it is a bound method to the class?

Yes - it is then a classmethod, bound to the class.

oop - Python Data Model Document : an unbound user-defined method obje...

python oop object methods python-datamodel

You need to use the files parameter to send a multipart form POST request even when you do not need to upload any files.

From the original requests source:

def request(method, url, **kwargs):
    """Constructs and sends a :class:`Request <Request>`.

    ...
    :param files: (optional) Dictionary of ``'name': file-like-objects``
        (or ``{'name': file-tuple}``) for multipart encoding upload.
        ``file-tuple`` can be a 2-tuple ``('filename', fileobj)``,
        3-tuple ``('filename', fileobj, 'content_type')``
        or a 4-tuple ``('filename', fileobj, 'content_type', custom_headers)``,
        where ``'content-type'`` is a string
        defining the content type of the given file
        and ``custom_headers`` a dict-like object 
        containing additional headers to add for the file.

The simplest multipart form request that includes both files to upload and form fields will look like this:

multipart_form_data = {
    'file1': open('myfile.zip', 'rb'),
    'file2': ('custom_file_name.zip', open('myfile.zip', 'rb')),
    'action': ('', 'store'),
    'path': ('', '/path1')
}

response = requests.post('https://httpbin.org/post', files=multipart_form_data)

print(response.content)
If the files parameter is not flexible enough for your form, the requests-toolbelt package can be used instead:

pip install requests_toolbelt

MultipartEncoder can be used both for multipart requests with or without actual upload fields. It must be assigned to the data parameter.

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

multipart_data = MultipartEncoder(
    fields={
            # a file upload field
            'file': ('file.py', open('file.py', 'rb'), 'text/plain'),
            # plain text fields
            'field0': 'value0',
            'field1': 'value1',
           }
    )

response = requests.post('http://httpbin.org/post', data=multipart_data,
                  headers={'Content-Type': multipart_data.content_type})

If you need to send multiple fields with the same name, or if order of form fields is important, then a tuple or a list can be used instead of a dictionary, i.e.:

multipart_data = MultipartEncoder(
    fields=(
            ('action', 'store'), 
            ('path', '/path1'),
            ('path', '/path2'),
            ('path', '/path3'),
           )
    )

Thank you for this. The order of keys was important to me and this helped a lot.

Amazing. Inexplicably, an api I am working with requires 2 different values for the same key. This is amazing. Thank you.

@ccpizza, what actually this line means? > "('file.py', open('file.py', 'rb'), 'text/plain')". It doesn't work for me :(

@DenisKoreyba: this is an example of a file upload field which assumes that a file named file.py is located in the same folder as your script.

How to send a "multipart/form-data" with requests in python? - Stack O...

python python-2.7 multipartform-data python-requests

You receive the 404 error because the URL is not valid. To solve it, apart from changing the xml path to use a correct file extension (see GAEfan's answer), remove the "_ah" part of the URL.

appcfg.py upload_data --url=http://app-id.appspot.com/remote_api --kind=xml --filename=myfile.xml

python - Unable to upload data to google app engine - Stack Overflow

python google-app-engine google-cloud-storage google-cloud-datastore

If you want a histogram, you don't need to attach any 'names' to the x-values, as the x-axis will show bins:

import matplotlib.pyplot as plt
import numpy as np
%matplotlib inline
x = np.random.normal(size = 1000)
plt.hist(x, density=True, bins=30)  # use normed=True on matplotlib versions before 2.1
plt.ylabel('Probability');

However, if you have limited number of data points, and you want a bar plot, then you may attach labels to x-axis:

x = np.arange(3)
plt.bar(x, height= [1,2,3])
plt.xticks(x+.5, ['a','b','c']);

You would solve OP's question much better if you had used his object, i.e. probability.

Remember, no semicolons at the end of the lines in python!

@Toad22222 This is an excerpt from Ipython notebook cell. Try to execute it without semicolon and see the difference. All the code snippets I post on SO run perfectly on my computer.

How to plot a histogram using Matplotlib in Python with a list of data...

python matplotlib visualization data-visualization

I've come back to this problem after a long time. The issue appears to be that Apache treats a CustomLog like a file -- something it can open, write to, close, and then reopen at a later date. This causes the receiving process to be told that its input stream has been closed. However, that doesn't mean the process's input stream cannot be written to again, just that whichever process was writing to it will not be writing to it again.

The best way to deal with this is to set up a handler and let the OS invoke it whenever input is written to standard input. Normally you should avoid relying heavily on OS signal handling, as signals are relatively expensive. However, copying a megabyte of text to the program below only produced two SIGIO events, so it's okay in this case.

import sys
import os
import signal
import fcntl
import threading

io_event = threading.Event()

# Event handlers should generally be as compact as possible.
# Here all we do is notify the main thread that input has been received.
def handle_io(signal, frame):
    io_event.set()

# invoke handle_io on a SIGIO event
signal.signal(signal.SIGIO, handle_io)
# send io events on stdin (fd 0) to our process 
assert fcntl.fcntl(0, fcntl.F_SETOWN, os.getpid()) == 0
# tell the os to produce SIGIO events when data is written to stdin
assert fcntl.fcntl(0, fcntl.F_SETFL, os.O_ASYNC) == 0

print("pid is:", os.getpid())
while True:
    data = sys.stdin.read()
    io_event.clear()
    print("got:", repr(data))
    io_event.wait()

How you might use this toy program. Output has been cleaned up due to interleaving of input and output.

$ echo test | python3 fancyecho.py &
[1] 25487
pid is: 25487
got: 'test\n'
$ echo data > /proc/25487/fd/0
got: 'data\n'
$

Python wait until data is in sys.stdin - Stack Overflow

python wait

Well, you want to have an answer that is up-to-date and modern.

When I need to send mail in Python, I use the Mailgun API, which gets a lot of the headaches with sending mail sorted out. They have a wonderful app/API that allows you to send 10,000 emails per month for free.

Sending an email would be like this:

import requests

def send_simple_message():
    return requests.post(
        "https://api.mailgun.net/v3/YOUR_DOMAIN_NAME/messages",
        auth=("api", "YOUR_API_KEY"),
        data={"from": "Excited User <mailgun@YOUR_DOMAIN_NAME>",
              "to": ["bar@example.com", "YOU@YOUR_DOMAIN_NAME"],
              "subject": "Hello",
              "text": "Testing some Mailgun awesomness!"})

You can also track events and lots more, see the quickstart guide.

This is indeed much more "of this date". Though it uses an external API. I'd personally use something like yagmail for keeping it internal.

@PascalvKooten Absolutely amusing to follow your constant advertising for yagmail (yes, Sir, I will consider it next time, Sir ;). But I find it very confusing that almost no one seems to care for OPs issue, but rather suggests much different solutions. It's as if I am asking how to change bulbs in my 2009 smart and the answer is: Buy a real Mercedes...

@flaschbier The reason no one cares about the OPs issue is because the title is wrong. "How to send an email with Python?" is the actual reason people come to look when they click that question, and they'd expect an answer that yagmail can provide: nice and short. There you go. More yagmail advertisement.

@PascalvKooten No offence. I was assuming your mission is to provide better email support in Python and I love that. Had I found yagmail last time I had to implement mail notifications, I absolutely would have considered it (the MIT license and installing 3rd party software would have been possible in that environment). Regarding the title of the question, I think you are absolutely right. Why not suggest an edit?

How to send an email with Python? - Stack Overflow

python email function smtplib

To read only the first row of the csv file use next() on the reader object.

with open('some.csv', newline='') as f:
  reader = csv.reader(f)
  row1 = next(reader)  # gets the first line
  # now do something here 
  # if the first row is the header, then you can do one more next() to get the next row:
  # row2 = next(reader)

or :

with open('some.csv', newline='') as f:
  reader = csv.reader(f)
  for row in reader:
    # do something here with `row`
    break

Thanks for the answer, but the second option is not recommended.

file - How to read one single line of csv data in Python? - Stack Over...

python file csv iterator next