<URL: http://www.hughes.net/~scheper/surf.htm >
<URL: mailto:scheper@hughes.net >
Quick instructions: Download the latest SURF program Win32 (Windows 95/98/NT) executable file from http://www.hughes.net/~scheper/surf.exe. Then save SURF.EXE into a directory in your path, for example, to filename C:\WINDOWS\SURF.EXE or C:\WINNT\SURF.EXE. That's all the installation it needs. Then open an MS-DOS prompt and type SURF to see its usage message.
Surf is an open source freeware Internet text web browser for the Windows 95, Windows 98, and Windows NT operating systems.
Surf fetches Internet web pages, converting the contents from HTML into narrowly wrapped text using the US-ASCII character set. Every web page is saved as one text file in the current directory. The saved files have an HTM file extension and contain just enough HTML markup to preserve the web page address, title, and clickable hypertext links.
Surf tabulates both the previously fetched and the as-yet unfetched URLs (Uniform Resource Locators, or web addresses) that it encounters into a single editable URL list file in a plain text format.
Whenever Surf is fetching web pages or reprocessing locally saved files, it records in the URL list file every URL it resolves from an anchor tag, area tag, frame tag, or redirection header, provided that the URL meets three criteria: it is an absolute URL, it uses the HTTP method, and it does not end in any of the file extensions .aif, .au, .dll, .gif, .gz, .hqx, .java, .jpeg, .jpg, .map, .mp2, .mpg, .pdf, .pl, .vmd, .wav, .z, or .zip.
Before each Surf execution, you determine the list of URLs to fetch by manually editing the list file and removing the ASCII asterisk character from one or more interesting URLs that you want Surf to collect next. I often delete all the asterisks before surfing, either to fetch every URL mentioned in a links page, or to harvest an entire site from its index page in just a few iterations of editing and surfing.
Each fetched web page creates a file in the current directory, smartly named from the best letters of the end of the URL path (and sometimes of the URL domain part), coerced to eight or fewer filename characters plus a ".htm" file extension, and made unique within the current directory.
While fetching web pages, Surf does not create files for any HTTP completion status other than success. Surf automatically follows all 301 and 302 redirections, saving only the real target web page. When Surf fetches a web page containing frames, it will eventually fetch all of the web pages anchored in those frames, because they are listed without asterisks in the URL list.
Surf fetches URLs in a random order, and will stop whenever there are no more unfetched URLs remaining in the list, or whenever you press a key within the current MS-DOS Prompt console window in which Surf is running. Surf then writes out the updated URL list file and exits.
Surf has no facilities for displaying the collected text files; use an editor or your conventional browser to read them. In Windows Explorer, double-clicking any HTML file should automatically display it in your browser.
Surf grooms the URL list file to serve as your guide to reading the best of the collected web pages, because it orders the URLs according to a weighted sum of three quality indices: uncommon vocabulary term count, apparent sentence count, and off-page link count. The URL list also helps you guess the topics a page discusses before reading it, by showing a few of its most frequently used words.
Surf also has a powerful, streamlined searching facility. With a single command, you can send your query to dozens of primary search engines around the world. The thirty-eight best search engines were chosen from hundreds, based on their suitability for English-language queries, boolean AND methods, proper names, and quoted phrases; on their speed (as assessed over a T1 line in California), hit quantity, quality, and uniqueness; and, most importantly, on whether Surf could understand and automatically parse their query result pages.
Whenever Surf fetches or reparses a query result page that can be recognized as coming from one of Surf's internally listed suitable search engines, Surf will record all the target web pages found by that search into the URL list file without asterisks, or remove any asterisk if that URL was already listed. Therefore, as fetching continues, target pages will begin to be fetched, often as many as a thousand reasonably relevant web pages, all automatically.
With slight effort, Surf can rearrange your collected web page files into a variety of directories (folders) according to how they rank against a list of words and word fragments you supply. Command-line file specifications that match directories are followed recursively, so Surf can reorganize, or create a URL list for, all of the locally saved web page files that you have organized topically into a hierarchy of subdirectories.
After you have edited your URL list file to create an ultimate link collection about your particular area of interest, Surf can write the fetched URLs in the list into an alternate HTML format as an ordered, numbered and annotated web page of clickable links.
Surf is the opportunistic development of a software engineer intent upon researching the Internet for text information in certain fields. Surf is optimized for English, and cannot process the wide characters used for Japanese, Chinese, Korean, etc. The Surf program source code, SURF.CPP, is provided so that any "C" programmer with a Microsoft Visual C++ 5.0 compiler can create a custom version.
Concepts from a sample program named "TEAR" as distributed with the Microsoft Visual C++ 5.0 compiler became the basis of a program segment that fetches Internet URLs using HTTP. Otherwise, Surf is all the original work of Glenn Scheper, and was developed by him independently of his employment in the software field.
Surf.cpp and Surf.exe Copyright (C) 1999 Glenn Scheper.
This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or any later version.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
All components of the latest program version will be available
at the following fixed URLs in the author's personal web site:
http://www.hughes.net/~scheper/surf.exe -- The Win32 executable.
http://www.hughes.net/~scheper/surf.cpp -- The C++ source code.
http://www.hughes.net/~scheper/surf.htm -- This text/html manual.
http://www.hughes.net/~scheper/surf.zip -- A pkzip of those 3 files.
The easiest way to download the program is to have your current web browser open the executable file, surf.exe. When the browser asks what to do with the file, tell it to save the file to disk, and choose a destination directory that is listed in your environment's execution "path" variable. If you are running Windows 95 or Windows 98, you would normally save surf.exe into the directory C:\Windows. If you are running Windows NT, you would normally save surf.exe into the directory C:\WINNT.
In addition to the executable file, you can also fetch the C++ source code from the relative location surf.cpp, and this text/HTML user guide, which you are now reading, from the location surf.htm. You can download all three files Pkzip'ed together from the location surf.zip.
Software distribution sites will probably offer the zipped file, named either surf.zip or surf14.zip. After downloading and saving the zip file, you must extract its contents by running a generally available file decompression program such as Pkunzip or Winzip.
To install surf, merely place the executable file surf.exe into any directory listed in your execution path as described above, or in the current directory where you want to execute it.
To uninstall surf, merely delete the surf.exe file.
Surf does not create any registry entries when it executes, nor any files other than the fetched HTML files and the URL list file. However, the Windows libraries that Surf uses to fetch URLs have side effects similar to those of running Microsoft's Internet Explorer program. For example, running Surf may cache copies of the fetched files and may create cookie files or logs. The libraries may have other side effects that I do not realize.
Surf is a "console application" type of program, so it does not have a graphical user interface. Instead, Surf is invoked by typing the name Surf, possibly followed by some arguments, at the console screen presented when you execute the Windows MS-DOS Prompt program.
The possible Surf command line arguments are described below.
The Surf program only fetches one Internet resource at a time, but the Windows operating system can fetch many Internet resources concurrently. You can double-click the MS-DOS Prompt program several times and execute Surf in each of the MS-DOS Prompts. If you do that, start each instance of Surf in a different current directory, and have each instance process a different URL list file.
For example, the path c:\i\q is where I do all my surfing. Whether I want to search for something, fetch everything mentioned in some links page, drag in all the HTML files from one web site, or just explore random links extending out from all my cached files, I first make a new directory under c:\i\q with the MS-DOS command md, enter that directory with the command cd, prepare a list.txt file if necessary, and then run Surf, letting it use its default name for the URL file, list.txt.
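To start on a new topic, then, the whole preparation takes only a few commands (the directory name and query here are just illustrations):

c:
cd \i\q
md hobby
cd hobby
surf -q model railroading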
To facilitate running multiple MS-DOS Prompts, I suggest you create a few shortcuts to your MS-DOS Prompt program and assign a hot key to each shortcut. For example, on my Windows 95 system, I would do as follows: Right click the overall Start button, and select Open. Then double click to open the folder named Programs, or any other folders you need to lead to your MS-DOS Prompt program icon. Right click on the MS-DOS Prompt program icon, and select Copy. Then go back and right click on your desktop, and click Paste Shortcut once, or more than once.
For each of the MS-DOS shortcuts, right click the icon on the desktop, and rename them differently. I liked A, B, C, etc. Then right click the icon named A again, and select properties. Click the tab named Program. Click the field named Working, and type in a convenient starting directory. I liked c:\i\q. Click the field named Shortcut Key and type one unique letter, like A for the A icon, B for the B icon, etc. I also like to click the properties tab named Screen and select Full Screen for each window. Finally, click Ok. Now from any window, typing the Ctrl + Alt + A hot key will go straight to that "A" copy of the MS-DOS Prompt, and similarly with Ctrl + Alt + B and with Ctrl + Alt + C, etc.
You can use those hot keys, or type Alt + Tab to shuffle through all the MS-DOS windows to monitor the surfing progress. Whenever the focus is in any one window, typing a keystroke will stop that Surf's execution, and write an updated URL list file. Once you stop Surf to examine the URL list, or if you stop Surf by accident, and you want to resume fetching from the default URL list named list.txt where Surf left off, just type "surf" with no arguments.
"Everything you can do, I can do better..." I was amazed to see someone manipulate a batch file and run Surf directly from the Microsoft Windows Explorer program. From then on, I adjusted the Surf output file extensions to TXT and HTM to permit automatic Notepad or browser invocation by double clicking the data files.
To operate Surf from Windows Explorer, I execute the Windows Explorer program, then double-click my \i and \i\q folders. To create a new subdirectory to hold my work, I do File, New, Folder, and then rename the new folder according to my current topic of interest, like "Glenn" when searching for my name.
I then double click on the new folder, so that it will be the current folder when I execute Surf, which is in the same sense as being "in" some current directory when executing from the MS-DOS Prompt.
I start to create a batch file, by doing File, New, Text Document. I double-click the newly created text document to both edit its contents and change its extension using Notepad. Changing the file extension using Notepad is an important trick. Read on...
Suppose I want to use Surf to search for my name. I would type the following single line, without the quotes, into the opened text file: "surf -q Glenn Scheper". Then I would do File, Save As. I must change two fields in this dialog box: before saving the document in Notepad, I must first change the Save As Type selection from Text Documents to All Files. I can then type a new filename with a ".BAT" extension, like "X.BAT", click Save to write the file, and do File, Exit to leave Notepad.
Observe that the new X.BAT file is shown with an executable icon rather than a text file icon in the Windows Explorer pane. That would not have happened if we had used Windows Explorer to merely right-click and rename the file as X.BAT, in which case it would still remain a text file, and it would have the name "X.BAT.TXT".
Now double click your batch file, and watch the HTML files start appearing in that pane of Windows Explorer showing the current folder. You can immediately double-click the HTML files to start viewing them, which will run your default conventional web browser, without interfering with the Surf process.
Whether you use the MS-DOS Prompt console screen, or use Windows Explorer to invoke Surf, the key to controlling Surf's activity lies with the specific parameters passed to Surf on the command line, and also with the contents that you edited in the URL list file, if any.
The URL file name is optional, so Surf tests whether its first parameter starts with a minus sign to determine whether it is a flag. If the first parameter does not start with a minus sign, it is assumed to specify the path of the URL list file, in which case any second parameter must consist of a minus sign and one of the few valid flag letters.
If a URL list file name is specified, it must not use any file naming metacharacters, such as ? or *, because Surf will not expand those to find the URL list file. A simple file name locates a file in the current directory. Otherwise, the file name may contain a drive letter followed by a colon, and a directory path either from the root or relative to the current directory. The URL file name may end with an arbitrary file extension, or no extension.
If Surf is invoked without a command letter parameter, it assumes the desired function is to start, or continue, fetching the web pages already listed in the URL list file. In this case, if Surf cannot locate the default (or specified) URL list file, it presents a short usage message showing all of Surf's invocation possibilities.
The valid command letters are (-f, -q, -a, -b, -r, -w) as well as the case of no letter, which continues fetching. You might remember these letters as (Fetch, Query, Anchor, Base, Relocate, and Write).
The -f or Fetch operation expects all further arguments to be URLs to fetch. For typing convenience, and in this location only, you may leave off the "http://" part of these URLs. An alternative to the -F function is to first edit the URL list file to contain all the desired URLs, with their http:// method prefixes, then execute Surf with no flag letter to fetch them.
The -q or Query operation expects all further arguments to be parts of one search engine query. Multiple arguments are joined together with single space characters. That query string is appended in turn to each of the thirty-eight search engine query URLs that are hard coded into the Surf program, and Surf proceeds to fetch the listed URLs, along with any matching web page URLs that the search engines locate.
The -a or Anchor variation of the operation to reparse local files accepts all further arguments as local file naming specifications. If no file name specifications are given, -A reparses every file in the current directory and in all directories under it. The URL list file, at either the specified or the default file name, is created, or augmented if it already exists, so that it contains both the fetched and the unfetched URLs that can be parsed from all the files. If the saved files contain BASE markup tags, then incompletely specified (relative) URLs can also be resolved and saved; otherwise they cannot.
The -b or Base variation of the operation to reparse local files accepts all further arguments as local file naming specifications. If no file name specifications are given, -B reparses every file in the current directory and in all directories under it. The URL list file, at either the specified or the default file name, is created, or augmented if it already exists, so that it contains all of the fetched URLs, as determined from any BASE markup tags saved in the files. Note that Surf always inserts a BASE tag in the locally saved files during fetching operations.
The -r or Relocate operation expects all further arguments to be local file naming specifications. The files to be relocated are determined from these arguments, not from the URL list file. -R is also special in that it must find a preexisting URL list file, and that list file must contain file rearrangement information, as I shall describe below. When executing Surf with the -R command, the list file is not modified, so the same list file may serve again and again to categorize files.
The -w or Write operation reformats whatever it finds in the URL list file into an HTML web page describing all the fetched URLs. The output is written to the standard output file, so it must be captured by appending a "greater than" (right angle bracket) character to the Surf command line, followed by the desired output file name.
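For example, the following command would capture the generated links page into a file named links.htm (the output file name is your choice):

surf -w > links.htm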
Any line whose first character is anything except an ASCII space character or a Commercial At sign (@) is interpreted as an absolute URL specification. If such a line does not contain a validly formed URL, it is ignored when the URL list file is read. Each URL in the URL list file must occupy one whole text line by itself, starting with the "H" or "h" of an explicit "http://" method prefix as the first character of the line and continuing until the URL ends at the final character of the line, with no extra spaces or tabs before, after, or in the middle of the URL.
Any line that starts with a space character is a note attached to the preceding URL. There is no requirement for a URL to have a note. The very first note line attached to a URL determines that URL's eligibility for fetching. If the note line contains an ASCII Number (Pound) sign (#) as the first character following the initial space, it marks that URL as having already been fetched. If instead an asterisk (*) appears in that position, the URL is a novel URL that has not yet been authorized for fetching. If any other character appears there, the URL is a candidate for fetching. Any URL without notes is automatically a candidate for fetching.
Any line that starts with a Commercial At sign (@) as its very first character, with no leading spaces, supplies the file relocation parameters needed for -R operations. Each such line contains a directory path immediately after the Commercial At sign, followed by a space, then by zero or more vocabulary words or word fragments; for example (omitting the quotes): "@c:\full\path\exs excess* access xxxx*". The paths may be absolute or relative, and the directories do not have to exist beforehand; Surf -R will create them. If any line contains just a path with no matching words after it, that path becomes the default, collecting all files that contain no match terms at all, as well as all search engine query result files. In the absence of such a default path, files not matching any of the listed words are not moved during the -R relocation operation. Your matching vocabulary terms should use only alphabetic characters (a-z), and they can never match any of the words hard coded in the CommonWordList array in the surf.cpp program. An asterisk at the end of a word fragment grants a match to every word matching up to the asterisk.
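Putting those rules together, a miniature URL list file might look like this (the URLs and note words are purely illustrative):

http://www.hughes.net/~scheper/surf.htm
 # surf browser windows text
http://www.example.com/links.htm
 * links example collection
http://www.example.com/index.htm
@c:\i\yes histor* war
@c:\i\no

The first URL has already been fetched (#), the second is novel and not yet authorized for fetching (*), and the third, having no note, is a candidate for fetching; the two @ lines would direct a later Surf -R run.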
Even now, as I write this paragraph, I have two other MS-DOS prompt console screens active with Surf fetching web pages. Later, when they finish or I stop them, I will go to one of the two working directories, open the URL list file with my text editor, and, as either the web page titles or their notes of most-used words pique my interest, I will open and read some of the locally saved web page text files.
If the web page contains a feature not supported by Surf, such as a graphic, an HTML form that I'd like to submit, or data that is only visible when using Java or Javascript, I could open the locally saved web page in a conventional browser, and click on a link that Surf inserts at the top of each page, to visit the original Internet resource.
I used to use a conventional browser a lot, Netscape Navigator 3, so I had quite a large bookmark file. It became my first starting place for surfing. After locating the bookmark file buried deep in, for example, "C:\Program Files\Netscape\Navigator\bookmark.htm", I would go there in an MS-DOS prompt window and create a list.txt file by saying "surf -a bookmark.htm". Then I would carry the new list.txt file off to a clean directory, remove all the asterisks, and say "surf".
Another way to start my URL list was to extract the absolute URLs from all the web pages that I had recently visited, that were still present in the browser cache. In the case of Netscape Navigator 3, it was by going in MS-DOS to "C:\Program Files\Netscape\Navigator\Cache", then saying "surf -a". In the case of Netscape Navigator 4, the cache moved to under the path "C:\Program Files\Netscape\Users\...\Cache".
In the case of Microsoft Internet Explorer, you can still scoop up all the URLs from its cache, but the cache is much harder to find and enter. As I found it on my system, the cache is spread across four subdirectories that each have hidden file attributes, but they can be revealed by using the /AH option of the MS-DOS DIR command, as shown in the sequence below. You should find four directories named cache1, cache2, cache3 and cache4; entering each one makes one quarter of the cached files accessible. You could then run Surf in each of the four directories, each time adding on to the same URL list file named x in the root directory.
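Laid out as one command sequence (the directory names are as I found them on my system; yours may differ):

cd "C:\WINDOWS\Temporary Internet Files"
dir /ah
cd cache1
surf \x -a

Then cd ..\cache2 and run Surf again, and likewise for cache3 and cache4; every pass appends to the same list file \x.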
To fetch one URL, a -F invocation might be best, because you do not have to type any http:// method prefix; for example: Surf -f www.hughes.net/~scheper/surf.htm; would fetch this Surf user manual into a local (text/html) file named surf.htm, or, if that name were taken, into the next available filename surf1.htm, surf2.htm, etc. Surf would also create a new list.txt file in the current directory, or augment a previously existing one, so that list.txt lists all the URLs found in the fetched web page.
To download an entire web site, fetch one page, preferably the top page or site index page. Edit your URL list and search down for the line bearing this legend: "----NOVEL LINKS----". Search further down for the first occurrence of a URL to that web site. All of the links staying within that site should appear in one block, or perhaps two blocks, with and without a sometimes optional "www." prefix. Remove the asterisks after each of the URLs going back to that site, close the editor, and surf the list again. Repeat editing and surfing until no more new URLs are discovered.
Next is an example in which I invoke Surf from an MS-DOS batch file, to conveniently pass a word to be used as the query, looking up that word's definition by fetching a query URL from a dictionary web server. The batch file anticipates the particular filename Surf will create after it follows the 301 or 302 redirection URL, then goes on to invoke a text editor to read the file.
@echo off
rem - DICT.BAT -- Arg 1 is the word to look up.
rem - there is always a redirection to new url:
rem - http://work.ucsd.edu:5141/cgi-bin/http_webster?...
surf dictlist -f http://c.gp.cs.cmu.edu:5103/prog/webster?%1
del dictlist
edit httpwebs.htm
del httpwebs.htm
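For instance, typing: dict ontology; (any word will do) fetches the definition page, deletes the temporary list file, opens the page in the MS-DOS editor, and deletes the page once the editor is closed.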
Browsing is great for leisure reading, or to develop a library, but when you need to know about something specific, right away, you want to access one or more Internet search engines.
Surf automates the massive retrieval of information from search queries as much as possible. Sometimes this is not the kind of searching you want, and sending a brief search engine query from your conventional browser might be preferable.
If you let Surf send your search queries, it prepares thirty-eight URLs by joining each of the built-in search engine query URLs, the best selected from many hundreds, with your specific query part, and begins to surf those listed URLs. As always, the URLs are fetched in random order. To be friendly, only one page of results is requested from each search engine.
The selected search engine result formats have been analyzed to mechanically distinguish hit URLs indicating matching web pages from all the other URLs returned on the search result pages. Such hit URLs will be added to the URL list without asterisks, which means they will automatically begin to be fetched randomly along with the remaining search engine query URLs.
The search engine query result pages are specially distinguished from any target web pages because they are given an Underscore (_) in the first character of their locally saved file names. That permits easy deletion of the query result pages with the MS-DOS command "DEL _*", if it is not desired to pull additional result pages from each search engine. Any additional result URLs will be listed in the URL list file, with an asterisk, and could be followed up like any other URLs, to search exhaustively.
A -Q invocation is used to load up the URLs for each query. Surf may be stopped immediately by any keystroke if you wish to examine the URL list file created by the -Q invocation. To resume fetching, simply say: Surf; without repeating the -Q command line arguments.
A simple query for one keyword would look like this: Surf -q keyword; Many common search engine conventions apply across most of the engines that Surf queries. For example, to search for a person's name, capitalize its parts, like this: Surf -q Glenn Scheper;
To search for a phrase, embed a pair of double quote marks around the whole phrase. It is tricky: I first use a pair of double quotes to join the words together, and inside that phrase I embed the two double quotes by typing each as one backslash followed by one quote character, like this: Surf -q "\"this exact phrase\"";
If you are writing a batch file that will invoke Surf, remember to convert any percent (%) characters into a pair of percent characters (%%), because a single percent would otherwise be seen as a macro reference, like %1, which accesses the first invocation argument of the batch file.
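For example, to pass a query containing a literal percent character from inside a batch file (here %20, a URL-encoded space, chosen only for illustration), the line would be written:

surf -q hello%%20world

At run time each %% collapses back into a single %, so Surf receives hello%20world.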
The URL list after running Surf -Q will be cluttered with many more advertising URLs than interesting URLs. It is best to rebuild the URL list file after all, or enough, of the query result and target hit pages have been fetched. First use the command: DEL _*; to remove all of the search result pages. Then do: DEL LIST.TXT; and either: Surf -a; or: Surf -b; to rebuild a fresh new list.txt.
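As one command sequence, using the default list name list.txt and the -a variation:

del _*
del list.txt
surf -a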
Another step I might take is to use DIVIDE, another of my Win32 freeware programs, to separate out all the files that do not contain the five letter sequence {space, t, h, e, space}. That step usually gets rid of all the non-text resources that produced empty files, and most of the foreign language, that is, non-English, files.
Remember, when faced with a few hundred potentially interesting new web pages, the key to reading the choicest web pages first is to start reading the fetched pages that are listed first in the URL list file.
Whenever I finish with a query, or with dragging in a site, I run a Surf -R operation to relocate all the files that contain keywords of interest to me into my permanent holdings, rather than simply discarding them. In this operation I always refer to a specific URL list file that holds my list of interesting match terms; it does not actually contain any URLs.
The text lines in that filtering "URL" list file look like this:
@\i\yes ancient archaeolog* caesar* civil histor* holocaust medieval sect* secular soldier* soviet war
@\i\yes aromatherap* aspect* astrolog* chart* crystal* compatab* deck divinat* dream* fortun* house* jupiter mars mercury moon mystic* neptune numerolog* occult* oil* orac* ouija planet* pluto psychic* reading* sign* spirit* star* tarot venus
@\i\no account* banner buscar busq* command* design* director* employ* engine engineer* graph* guest* guide* industr* internet isp member metasearch multimedia query ring search suchma* traff* user* web
@\i\no
That final line, having no match terms after the path \i\no, collects into that path all files that matched none of the filtering terms, as well as all recognized search engine query result pages. In addition, I deliberately listed terms that often occur on pages I do not want to keep, to push those pages toward the undesired path, \i\no, for eventual manual file deletion.
I have another version of that "URL" list file in which the first line sends its matching web pages into the path \i\anci, the second line sends its matching web pages into the path \i\astr, and so on, as sketched below. The categorization is a little fuzzy, and some web pages will not fall in the directory you wish; just keep adding to your match terms. For each (@) line, the number of times a web page's words match any term listed on that line is summed up, and the path on the highest scoring line receives the page. If there is no default path, files with no matching terms are not moved.
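A shortened illustration of that topical version, using only terms already shown above:

@\i\anci ancient archaeolog* caesar* medieval
@\i\astr astrolog* occult* planet* tarot
@\i\no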
After segregating files into directories according to predominant topic, I use the Surf -B invocation to build a fresh URL list for each topical directory, then use the Surf -W invocation to write an HTML links page describing those holdings, for publishing on my web site. Of course, it is desirable to manually review and screen the contents of the URL list.
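For each topical directory, the pair of invocations might look like this (the directory and output file names are only examples):

cd \i\anci
surf -b
surf -w > anci.htm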
If some of the file relocation lines starting with (@) are placed in the URL list file along with the URLs before doing Surf -W, Surf will list subsections according to how each page's few most used words match the relocation terms. This is a holdover from an older file relocation method, and is not as robust as the division performed by Surf -R.
Periodically, before I update my link pages, I segregate all the best scoring web pages by manually editing the URL list file, then running Surf -R. On this prime pool, I rebuild the URL list with Surf -A, then I take the asterisks off and fetch another generation of choice pages.
There is as yet no method for Surf to automatically update files that have changed, or have later file dates than the last saved file date. My method is to build a URL list, delete all those oldest HTML files, edit to remove Number signs (#) from the URL list, then Surf the list.
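Sketched as a batch-style sequence, with the manual steps shown as comments (assuming the -b variation to recover the fetched URLs):

surf -b
rem - delete the stale .htm files by hand, then edit list.txt
rem - to remove the leading # from each note line, and finally:
surf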
Hereafter to the end follows the entire license document from http://www.fsf.org/copyleft/gpl.html. I hope the redundant HTML markups (base, html, head, title, and body) do not confuse your browser; if they do, go to that URL.
GNU GENERAL PUBLIC LICENSE
Version 2, June 1991
Copyright (C) 1989, 1991 Free Software Foundation, Inc.
59 Temple Place - Suite 330, Boston, MA 02111-1307, USA

Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.
The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. This General Public License applies to most of the Free Software Foundation's software and to any other program whose authors commit to using it. (Some other Free Software Foundation software is covered by the GNU Library General Public License instead.) You can apply it to your programs, too.
When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. These restrictions translate to certain responsibilities for you if you distribute copies of the software, or if you modify it.
For example, if you distribute copies of such a program, whether gratis or for a fee, you must give the recipients all the rights that you have. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights.
We protect your rights with two steps: (1) copyright the software, and (2) offer you this license which gives you legal permission to copy, distribute and/or modify the software.
Also, for each author's protection and ours, we want to make certain that everyone understands that there is no warranty for this free software. If the software is modified by someone else and passed on, we want its recipients to know that what they have is not the original, so that any problems introduced by others will not reflect on the original authors' reputations.
Finally, any free program is threatened constantly by software patents. We wish to avoid the danger that redistributors of a free program will individually obtain patent licenses, in effect making the program proprietary. To prevent this, we have made it clear that any patent must be licensed for everyone's free use or not licensed at all.
The precise terms and conditions for copying, distribution and modification follow.
0. This License applies to any program or other work which contains a notice placed by the copyright holder saying it may be distributed under the terms of this General Public License. The "Program", below, refers to any such program or work, and a "work based on the Program" means either the Program or any derivative work under copyright law: that is to say, a work containing the Program or a portion of it, either verbatim or with modifications and/or translated into another language. (Hereinafter, translation is included without limitation in the term "modification".) Each licensee is addressed as "you".
Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running the Program is not restricted, and the output from the Program is covered only if its contents constitute a work based on the Program (independent of having been made by running the Program). Whether that is true depends on what the Program does.
1. You may copy and distribute verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and give any other recipients of the Program a copy of this License along with the Program.
You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee.
2. You may modify your copy or copies of the Program or any portion of it, thus forming a work based on the Program, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions:

a) You must cause the modified files to carry prominent notices stating that you changed the files and the date of any change.

b) You must cause any work that you distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties under the terms of this License.

c) If the modified program normally reads commands interactively when run, you must cause it, when started running for such interactive use in the most ordinary way, to print or display an announcement including an appropriate copyright notice and a notice that there is no warranty (or else, saying that you provide a warranty) and that users may redistribute the program under these conditions, and telling the user how to view a copy of this License. (Exception: if the Program itself is interactive but does not normally print such an announcement, your work based on the Program is not required to print an announcement.)

These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Program, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Program, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it.
Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Program.
In addition, mere aggregation of another work not based on the Program with the Program (or with a work based on the Program) on a volume of a storage or distribution medium does not bring the other work under the scope of this License.
3. You may copy and distribute the Program (or a work based on it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you also do one of the following:

a) Accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or,

b) Accompany it with a written offer, valid for at least three years, to give any third party, for a charge no more than your cost of physically performing source distribution, a complete machine-readable copy of the corresponding source code, to be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or,

c) Accompany it with the information you received as to the offer to distribute corresponding source code. (This alternative is allowed only for noncommercial distribution and only if you received the program in object code or executable form with such an offer, in accord with Subsection b above.)

The source code for a work means the preferred form of the work for making modifications to it. For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable.
If distribution of executable or object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place counts as distribution of the source code, even though third parties are not compelled to copy the source along with the object code.
4. You may not copy, modify, sublicense, or distribute the Program except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense or distribute the Program is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance.
5. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Program or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Program (or any work based on the Program), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Program or works based on it.
6. Each time you redistribute the Program (or any work based on the Program), the recipient automatically receives a license from the original licensor to copy, distribute or modify the Program subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties to this License.
7. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Program at all. For example, if a patent license would not permit royalty-free redistribution of the Program by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Program.
If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply and the section as a whole is intended to apply in other circumstances.
It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system, which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice.
This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License.
8. If the distribution and/or use of the Program is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Program under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License.
9. The Free Software Foundation may publish revised and/or new versions of the General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the Program specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of this License, you may choose any version ever published by the Free Software Foundation.
10. If you wish to incorporate parts of the Program into other free programs whose distribution conditions are different, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally.
NO WARRANTY
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found.
one line to give the program's name and an idea of what it does.
Copyright (C) yyyy name of author

This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307, USA.
Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) yyyy name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'. This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, the commands you use may be called something other than `show w' and `show c'; they could even be mouse-clicks or menu items--whatever suits your program.
You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the program, if necessary. Here is a sample; alter the names:
Yoyodyne, Inc., hereby disclaims all copyright interest in the program `Gnomovision' (which makes passes at compilers) written by James Hacker.

signature of Ty Coon, 1 April 1989
Ty Coon, President of Vice
This General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Library General Public License instead of this License.