Difference between CR LF, LF and CR line break types?

I’d like to know the difference (with examples if possible) between CR LF (Windows), LF (Unix) and CR (Macintosh) line break types.

7 Answers


Popular Answers:

  1. It’s really just about which bytes are stored in a file. CR is the byte code for carriage return (from the days of typewriters) and LF is the byte code for line feed; they are simply the bytes placed in a file as end-of-line markers.

    Way more information, as always, on Wikipedia.
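
    To see those bytes concretely, here is a minimal Python sketch (the file names are just placeholders):

    ```python
    # Write the same two lines with each end-of-line convention,
    # then read the raw bytes back.
    for name, eol in [("windows.txt", b"\r\n"), ("unix.txt", b"\n"), ("mac.txt", b"\r")]:
        with open(name, "wb") as f:        # binary mode: no newline translation
            f.write(b"first" + eol + b"second" + eol)
        with open(name, "rb") as f:
            print(name, f.read())
    # windows.txt b'first\r\nsecond\r\n'
    # unix.txt    b'first\nsecond\n'
    # mac.txt     b'first\rsecond\r'
    ```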

  2. Jeff Atwood has a blog post about this: The Great Newline Schism

    Here is the essence from Wikipedia:

    The sequence CR+LF was in common use on many early computer systems that had adopted teletype machines, typically an ASR33, as a console device, because this sequence was required to position those printers at the start of a new line. On these systems, text was often routinely composed to be compatible with these printers, since the concept of device drivers hiding such hardware details from the application was not yet well developed; applications had to talk directly to the teletype machine and follow its conventions.

    The separation of the two functions concealed the fact that the print head could not return from the far right to the beginning of the next line in one-character time. That is why the sequence was always sent with the CR first. In fact, it was often necessary to send extra characters (extraneous CRs or NULs, which are ignored) to give the print head time to move to the left margin. Even after teletypes were replaced by computer terminals with higher baud rates, many operating systems still supported automatic sending of these fill characters, for compatibility with cheaper terminals that required multiple character times to scroll the display.

  3. CR – ASCII code 13

    LF – ASCII code 10.

    Theoretically, CR returns the cursor to the first position (on the left), and LF feeds one line, moving the cursor down one line. This is how, in the old days, you controlled printers and text-mode monitors. These characters are usually used to mark the ends of lines in text files, and different operating systems adopted different conventions: as you pointed out, Windows uses the CR/LF combination, while pre-OSX Macs used just CR, and so on.
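
    A quick way to confirm those codes in Python:

    ```python
    print(ord("\r"), hex(ord("\r")))   # 13 0xd  -> CR, carriage return
    print(ord("\n"), hex(ord("\n")))   # 10 0xa  -> LF, line feed
    ```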

  4. Systems based on ASCII or a compatible character set use either LF (line feed, 0x0A, 10 in decimal) or CR (carriage return, 0x0D, 13 in decimal) individually, or CR followed by LF (CR+LF, 0x0D 0x0A). These characters are based on printer commands: the line feed indicated that one line of paper should feed out of the printer, and a carriage return indicated that the printer carriage should return to the beginning of the current line.

    Here are the details.
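
    All three conventions can be handled portably when splitting text; for example, Python’s str.splitlines() recognizes each of them:

    ```python
    text = "one\r\ntwo\nthree\rfour"   # CRLF, LF, and CR mixed in one string
    print(text.splitlines())           # ['one', 'two', 'three', 'four']
    ```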

  5. The sad state of “record separators” or “line terminators” is a legacy of the dark ages of computing.

    Now, we take it for granted that anything we want to represent is in some way structured data and conforms to various abstractions that define lines, files, protocols, messages, markup, whatever.

    But once upon a time this wasn’t exactly true. Applications built control characters and device-specific processing directly into the data they emitted. The brain-dead systems that required both CR and LF simply had no abstraction for record separators or line terminators: the CR was necessary to get the teletype or video display to return to column one, and the LF (today NL, same code) was necessary to get it to advance to the next line. I guess the idea of doing something other than dumping the raw data to the device was too complex.

    Unix and Mac actually specified an abstraction for the line end, imagine that. Sadly, they specified different ones. (Unix, ahem, came first.) And naturally, they used a control code that was already “close” to S.O.P.

    Since almost all of our operating software today is a descendant of Unix, Mac, or Microsoft operating software, we are stuck with the line-ending confusion.

  6. CR and LF are a special set of characters that help us format our code.

    1. CR (\r) stands for CARRIAGE RETURN. It puts the cursor at the beginning of a line but doesn’t create a new line. This is how classic Mac OS worked.

    2. LF (\n) stands for LINE FEED. It creates a new line but doesn’t put the cursor at the beginning of that line; the cursor stays at the end of the last line. This is how Unix and Linux work.

    3. CRLF (\r\n) creates a new line and puts the cursor at the beginning of the new line. This is what we see in Windows.

    Git uses LF by default, so when we use Git on Windows it throws a warning like “CRLF will be replaced by LF” and automatically converts all CRLF into LF so that the code stays compatible (see the sketch below). NB: don’t worry; treat this less as a warning and more as a notice.
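
    Git’s conversion is analogous to the “universal newlines” translation many runtimes perform in text mode. A small Python sketch (the file name is hypothetical):

    ```python
    with open("demo.txt", "wb") as f:
        f.write(b"one\r\ntwo\r\n")        # Windows-style CRLF bytes on disk

    with open("demo.txt", "r") as f:      # text mode: CR, LF, and CRLF all read back as "\n"
        print(repr(f.read()))             # 'one\ntwo\n'

    with open("demo.txt", "rb") as f:     # binary mode shows the real bytes
        print(f.read())                   # b'one\r\ntwo\r\n'
    ```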

  7. NL derives from the EBCDIC NL = x'15', which would logically compare to CRLF, x'0D0A', in ASCII. This becomes evident when physically moving data from mainframes to midrange systems. Colloquially (as only arcane folks use EBCDIC), NL has been equated with either CR, LF, or CRLF.
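
    If I understand the mapping correctly, you can check it with Python’s EBCDIC codec (cp037 chosen here as one common code page): byte x'15' decodes to U+0085, the Unicode NEXT LINE (NEL) control.

    ```python
    nl = b"\x15".decode("cp037")   # EBCDIC NL byte
    print(hex(ord(nl)))            # 0x85 -> U+0085 NEXT LINE (NEL)
    ```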

Tags: line-breaks