I have a program that creates a TCP server socket and listens on it. When the program accepts a connection from a client, it calls poll(2) repeatedly to see if there is any input available, like this:

```c
for (int i = 0; i < 1000000; i++) {
    struct pollfd fds[] = {{ .fd = clientfd, .events = POLLIN, .revents = 0 }};
    int r = poll(fds, 1, 0);
    if (r < 0) {
        perror("poll");
        exit(1);
    }
}
```

On my Intel iMac, this takes about 15 seconds. If I use select(2) instead, as follows, it takes about 550 ms:

```c
for (int i = 0; i < 1000000; i++) {
    struct timeval timeout = { .tv_sec = 0, .tv_usec = 0 };
    fd_set infds;
    FD_ZERO(&infds);
    FD_SET(clientfd, &infds);
    int r = select(clientfd + 1, &infds, 0, 0, &timeout);
    if (r < 0) {
        perror("select");
        exit(1);
    }
}
```

https://gist.github.com/xrme/cd42d3d753ac00861e439e012366f2cf is a C source file that contains a complete example. If you're inclined to take a look, please download the file from the gist, build it with `cc poll-test.c`, and run `./a.out`. Connect to it with `nc 127.0.0.1` on the port the program listens on.
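For anyone who'd rather not fetch the gist, here's a minimal self-contained sketch of the same setup, reconstructed from the snippets above rather than taken from the gist itself. The port number (9999) and the use of CLOCK_MONOTONIC for timing are my own choices, not necessarily what the gist does:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#include <poll.h>
#include <unistd.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(void) {
    int listenfd = socket(AF_INET, SOCK_STREAM, 0);
    if (listenfd < 0) { perror("socket"); exit(1); }

    int on = 1;
    setsockopt(listenfd, SOL_SOCKET, SO_REUSEADDR, &on, sizeof(on));

    /* Bind to loopback on an assumed port; the gist may differ. */
    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = htons(9999);

    if (bind(listenfd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind"); exit(1);
    }
    if (listen(listenfd, 1) < 0) { perror("listen"); exit(1); }

    printf("listening on 127.0.0.1:9999\n");
    int clientfd = accept(listenfd, NULL, NULL);
    if (clientfd < 0) { perror("accept"); exit(1); }

    /* Time 1,000,000 zero-timeout poll() calls on the accepted socket. */
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < 1000000; i++) {
        struct pollfd fds[] = {{ .fd = clientfd, .events = POLLIN, .revents = 0 }};
        int r = poll(fds, 1, 0);
        if (r < 0) { perror("poll"); exit(1); }
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double elapsed = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("poll loop: %.3f s\n", elapsed);

    close(clientfd);
    close(listenfd);
    return 0;
}
```

Swapping the poll() call for the select() loop shown earlier is the only change needed to time the other case.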