
The goal of InputStream and OutputStream is to abstract different ways to input and output: whether the stream is a file, a web page, or the screen shouldn't matter. All that matters is that you receive information from the stream (or send information into that stream).

InputStream is used for many things that you read from.

OutputStream is used for many things that you write to.

Here's some sample code. It assumes the InputStream instr and OutputStream osstr have already been created:

int i;

while ((i = instr.read()) != -1) {
    osstr.write(i);
}

instr.close();
osstr.close();
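
On Java 7 and later, the same copy can be written with try-with-resources so that both streams are closed even if read or write throws. A sketch, with in-memory streams standing in for instr and osstr:

```java
import java.io.*;

public class StreamCopy {
    // Copies every byte from in to out; the caller manages closing.
    static void copy(InputStream in, OutputStream out) throws IOException {
        int i;
        while ((i = in.read()) != -1) {
            out.write(i);
        }
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        // Both streams are closed automatically at the end of this block.
        try (InputStream instr = new ByteArrayInputStream("hello".getBytes());
             OutputStream osstr = sink) {
            copy(instr, osstr);
        }
        System.out.println(sink); // prints "hello"
    }
}
```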

@KorayTugay A stream is generally a sequence of data, bytes or characters, read or written one element at a time rather than all at once.

java - What is InputStream & Output Stream? Why and when do we use the...

java io inputstream outputstream

InputStream is used for reading, OutputStream for writing. They are connected as decorators to one another such that you can read/write all different types of data from all different types of sources.
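
For instance, you can write primitives with DataOutputStream and read them back with DataInputStream. A sketch of the write side (a temp file stands in for a real path like C:/text.bin):

```java
import java.io.*;

public class WriteDemo {
    // Writes a boolean and an int in DataOutputStream's binary format.
    static File write() throws IOException {
        File file = File.createTempFile("text", ".bin"); // stands in for C:/text.bin
        DataOutputStream stream = new DataOutputStream(new FileOutputStream(file));
        stream.writeBoolean(true);
        stream.writeInt(1234);
        stream.close();
        return file;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(write()); // prints the temp file's path
    }
}
```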

To read back contents written this way (a boolean followed by an int):

File file = new File("C:/text.bin");
DataInputStream stream = new DataInputStream(new FileInputStream(file));
boolean isTrue = stream.readBoolean();
int value = stream.readInt();
stream.close();
System.out.println(isTrue + " " + value);

You can use other types of streams to enhance the reading/writing. For example, you can introduce a buffer for efficiency:

DataInputStream stream = new DataInputStream(
    new BufferedInputStream(new FileInputStream(file)));

You can write other data such as objects:

MyClass myObject = new MyClass(); // MyClass must implement Serializable
ObjectOutputStream stream = new ObjectOutputStream(
    new FileOutputStream("C:/text.obj"));
stream.writeObject(myObject);
stream.close();
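
The matching read side uses ObjectInputStream. A self-contained round-trip sketch, using an in-memory stream and an illustrative Point class in place of MyClass:

```java
import java.io.*;

public class ObjectDemo {
    // Illustrative stand-in for MyClass; must implement Serializable.
    static class Point implements Serializable {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    // Serializes p to bytes and deserializes a copy back out.
    static Point roundTrip(Point p) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bytes);
        out.writeObject(p);
        out.close();
        ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()));
        Point copy = (Point) in.readObject();
        in.close();
        return copy;
    }

    public static void main(String[] args) throws Exception {
        Point copy = roundTrip(new Point(3, 4));
        System.out.println(copy.x + " " + copy.y); // prints "3 4"
    }
}
```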

You can read from other different input sources:

byte[] test = new byte[] {0, 0, 1, 0, 0, 0, 1, 1, 8, 9};
DataInputStream stream = new DataInputStream(new ByteArrayInputStream(test));
int value0 = stream.readInt();
int value1 = stream.readInt();
byte value2 = stream.readByte();
byte value3 = stream.readByte();
stream.close();
System.out.println(value0 + " " + value1 + " " + value2 + " " + value3);

For most input streams there is a corresponding output stream, too. You can define your own streams for reading/writing special things, and there are stream classes for complex formats (for example, there are streams for reading/writing the ZIP format).
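
For instance, the JDK's ZIP support is built as exactly such stream decorators. A self-contained, in-memory sketch:

```java
import java.io.*;
import java.util.zip.*;

public class ZipDemo {
    public static void main(String[] args) throws IOException {
        // Write one entry into an in-memory ZIP archive.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        ZipOutputStream zipOut = new ZipOutputStream(bytes);
        zipOut.putNextEntry(new ZipEntry("hello.txt"));
        zipOut.write("hello zip".getBytes());
        zipOut.closeEntry();
        zipOut.close();

        // Read it back through ZipInputStream.
        ZipInputStream zipIn = new ZipInputStream(
                new ByteArrayInputStream(bytes.toByteArray()));
        ZipEntry entry = zipIn.getNextEntry();
        ByteArrayOutputStream content = new ByteArrayOutputStream();
        int c;
        while ((c = zipIn.read()) != -1) {
            content.write(c);
        }
        zipIn.close();
        System.out.println(entry.getName() + ": " + content); // prints "hello.txt: hello zip"
    }
}
```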

A stream is a sequence of data.

The data source and data destination can be anything that holds, generates, or consumes data. Obviously this includes disk files, but a source or destination can also be another program, a peripheral device, a network socket, or an array.

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class CopyBytes {
    public static void main(String[] args) throws IOException {

        FileInputStream in = null;
        FileOutputStream out = null;

        try {
            in = new FileInputStream("xanadu.txt");
            out = new FileOutputStream("outagain.txt");
            int c;

            while ((c = in.read()) != -1) {
                out.write(c);
            }
        } finally {
            if (in != null) {
                in.close();
            }
            if (out != null) {
                out.close();
            }
        }
    }
}

Have a look at this SE question for more details about advanced character streams, which are wrappers on top of byte streams:

java - What is InputStream & Output Stream? Why and when do we use the...

I would probably drain the input stream into a byte[] using ByteArrayOutputStream and then create a new ByteArrayInputStream based on the result every time I need to reread the stream.

ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] buf = new byte[1024];
int n = 0;
while ((n = myInputStream.read(buf)) >= 0)
    baos.write(buf, 0, n);
byte[] content = baos.toByteArray();

InputStream is1 = new ByteArrayInputStream(content);
... use is1 ...

InputStream is2 = new ByteArrayInputStream(content);
... use is2 ...

Related, and possibly useful, questions and answers:

java - How can I reopen a closed InputStream when I need to use it 2 t...

java android inputstream

You read from an InputStream and write to an OutputStream.

For example, say you want to copy a file. You would create a FileInputStream to read from the source file and a FileOutputStream to write to the new file.

If your data is a character stream, you could use a FileReader instead of a FileInputStream and a FileWriter instead of a FileOutputStream if you prefer.

InputStream input = ... // many different types
OutputStream output = ... // many different types

byte[] buffer = new byte[1024];
int n = 0;
while ((n = input.read(buffer)) != -1)
    output.write(buffer, 0, n);

input.close();
output.close();
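
The character-stream analogue of the byte loop above looks almost identical, just with a char[] buffer. A sketch, with StringReader/StringWriter standing in for FileReader/FileWriter:

```java
import java.io.*;

public class CharCopy {
    // Same copy loop as above, but over characters instead of bytes.
    static void copy(Reader input, Writer output) throws IOException {
        char[] buffer = new char[1024];
        int n;
        while ((n = input.read(buffer)) != -1) {
            output.write(buffer, 0, n);
        }
    }

    public static void main(String[] args) throws IOException {
        StringWriter out = new StringWriter(); // stands in for a FileWriter
        copy(new StringReader("some text"), out);
        System.out.println(out); // prints "some text"
    }
}
```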

OutputStream is an abstract class that represents writing output. There are many different OutputStream classes, and they write out to certain things (like the screen, or Files, or byte arrays, or network connections, or etc). InputStream classes access the same things, but they read data in from them.

A good basic exercise is to use FileOutputStream and FileInputStream to write data to a file, then read it back in.


Your stack trace and code do not seem to match up. From the stack trace it looks like doInBackground is recursively calling itself and then InputStream.reset(). But I can see neither call in your code.

Regarding your actual problem re-reading the stream: Since you apparently already tried (and failed) with InputStream.reset(), the amount of data in that stream is probably too large for that (or you forgot to call InputStream.mark()). The easy way is to create another connection + stream and read that. It does mean that you will actually and inefficiently transfer that data twice from the URL.
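
If the data is small enough, mark/reset can work, provided the stream supports marking (BufferedInputStream does) and the read limit passed to mark() covers everything read before reset(). A sketch with an in-memory stream:

```java
import java.io.*;

public class MarkResetDemo {
    public static void main(String[] args) throws IOException {
        InputStream in = new BufferedInputStream(
                new ByteArrayInputStream("payload".getBytes()));
        in.mark(1024); // valid while at most 1024 bytes are read before reset()
        int first = in.read();
        in.reset(); // rewind to the mark
        int again = in.read();
        System.out.println(first == again); // prints "true"
    }
}
```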

I think it is better to use the first option, reading the entire contents into memory / a temporary file / a Document. So the best thing is to read the Document and write it out to a file later, for example.

If available memory is not an issue, load the contents into a byte array and then use ByteArrayInputStream based on that. Only thing that might be a cause for trouble is, if that XML has any relative references (eg. XML includes). And that case can easily be fixed by passing the original URL of the document to the parser (ie. DocumentBuilder.parse(InputStream is, String systemId)).

java - Why InputStream is closed when I use a BuffferedReader? - Stack...

java android inputstream bufferedreader

After doing a lot of research and reading different posts and blogs, I was finally able to resolve my issue.

I referred the questions asked here and got the idea for doing this.

The other solutions suggested in this thread didn't work out for me.

Here is what I did,

I used a URIResolver instead of a parameter.

public class DocumentURIResolver implements URIResolver {

    final Map<String, Document> _documents;

    public DocumentURIResolver(final Map<String, Document> documents) {
        _documents = documents;
    }

    public Source resolve(final String href, final String base) {
        final Document doc = _documents.get(href);
        return (doc != null) ? new DOMSource(doc) : null;
    }
}
public ByteArrayOutputStream merge(final InputStream file1, final InputStream file2) {
    final ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
    final TransformerFactory tFactory = TransformerFactory.newInstance();
    Transformer transformer;
    try {
        transformer = tFactory.newTransformer(new StreamSource("merge.xslt"));
        final DocumentBuilder db = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        final Document documentFile = db.parse(file2);
        Map<String, Document> docs = new HashMap<String, Document>();
        docs.put("lookup", documentFile);
        transformer.setURIResolver(new DocumentURIResolver(docs));
        transformer.transform(new StreamSource(file1), new StreamResult(outputStream));
    } catch (final TransformerConfigurationException e) {
        LOG.warn("Problem occurred transforming files configuration issue", e);
    } catch (final TransformerException e) {
        LOG.warn("Problem occurred transforming files", e);
    }
    return outputStream;
}
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
<xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
<xsl:strip-space elements="*"/>

<xsl:variable name="lookup" select="document('lookup')"/>

<xsl:template match="/">
  <!-- Do the processing how you want to -->
</xsl:template>
</xsl:stylesheet>

java - Passing a XML File (InputStream) to XSLT avoid using Document i...

java xml xslt javax.xml

You can use apache.commons.net.ftp.FTPClient (it is in the commons-net.jar library).

It has methods such as setRestartOffset(yourOffset); call it before retrieving the file, and the file data will start from the specified offset.

java - How to resume reading an InputStream using the FTP protocol - S...

java ftp inputstream skip bufferedinputstream

You can pass a test String into the method as an InputStream like this:

InputStream stream = new ByteArrayInputStream(exampleString.getBytes());

For example:

String str = "\n\n\n";
InputStream stream = new ByteArrayInputStream(str.getBytes());
System.out.println(foo(stream));

@JBNizet you're right. I removed the custom charset.

@KorayTugay as I said in my - now deleted - comment: the code uses the default charset to read from the stream. So if that is the correct thing to do, the string should also be transformed to bytes using this default charset.

java - How can I unit test a method that uses a Scanner and an InputSt...

java unit-testing inputstream fileinputstream

The Provider should be triggered only in specific cases, so it shouldn't affect your whole application. But if you need the body in the requests handled by the provider, then you still have a workaround:

  • implement a servlet Filter that wraps the incoming request
  • in the wrapper, cache the request body, and override the getInputStream() method to return a ByteArrayInputStream over the cached body. That way it can be read multiple times.

Spring's AbstractRequestLoggingFilter does something similar and has an example wrapper; you can check it.

That sounds likely; just to shore up my understanding, "wrap the request" means extend ServletRequest and pass that on when I call chain.doFilter(request, response). The Provider is indeed only triggered in specific cases, but of course, the case where it is triggered is the one where I need both the body in the provider and the handler method. Anyhow, thanks, and I'll accept the answer as soon as I get this running.

Yes, there's HttpServletRequestWrapper to help with that: you extend it, instantiate it passing the original request, and it delegates all methods except the overridden ones.

java - Spring: How to use @RequestBody after reading inputStream - Sta...

java spring http rest inputstream

For discrete objects you can use a producer/consumer pattern with ConcurrentLinkedQueues or BlockingQueues - each module has its own queue, and it will continually poll its queue (or use take if it's a BlockingQueue), process the object, and offer it to the next module's queue.

This pattern can also work with a byte stream if you chunk it into smaller byte arrays that you pass through the queues, but this isn't always appropriate (e.g. if module1 reads, module2 compresses, and module3 encrypts, then you're probably better off keeping the data in streams, unless you have some reasonable way to chunk the data).
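
A minimal two-stage sketch of that pattern with ArrayBlockingQueue (the stage logic and the -1 end-of-stream marker are illustrative):

```java
import java.util.*;
import java.util.concurrent.*;

public class PipelineDemo {
    // Runs a two-stage pipeline; -1 is the end-of-stream marker.
    static List<Integer> runPipeline() throws InterruptedException {
        BlockingQueue<Integer> toStage2 = new ArrayBlockingQueue<>(16);
        BlockingQueue<Integer> results = new ArrayBlockingQueue<>(16);

        // Stage 1: produce 1, 2, 3, then signal end-of-stream.
        Thread stage1 = new Thread(() -> {
            try {
                for (int i = 1; i <= 3; i++) toStage2.put(i);
                toStage2.put(-1);
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });

        // Stage 2: transform each item (illustrative: multiply by 10).
        Thread stage2 = new Thread(() -> {
            try {
                int item;
                while ((item = toStage2.take()) != -1) results.put(item * 10);
                results.put(-1);
            } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });

        stage1.start();
        stage2.start();
        List<Integer> out = new ArrayList<>();
        int item;
        while ((item = results.take()) != -1) out.add(item);
        stage1.join();
        stage2.join();
        return out;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runPipeline()); // prints "[10, 20, 30]"
    }
}
```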

Yes, my case is very much like module1 parses, module2 compresses, and module3 encrypts. Chunking data to make it fit the ConcurrentLinkedQueue or BlockingQueue pattern doesn't make sense. And it all happens in sequence; it's simple, with no concurrency involved.

I'll definitely hear and learn more about this, but so far it appears to me that InputStream/OutputStream is badly designed. It would have been a nice and clear way to define a data-handling interface.


java - Using InputStream/OutputStream to handle data flows through mod...

java inputstream outputstream dataflow

You could create a general module which implements the OutputStream to InputStream conversion and place one instance of the module between each of your other modules. You could even get fancy with it and make the module intelligent enough to route the messages from one module to any of the other modules. This would become a sort of gateway or router type module.

Alternatively, you could implement something a bit heavier weight with a message queuing and passing framework like ZeroMQ. --ap
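
The JDK's own building block for such a conversion module is PipedInputStream/PipedOutputStream: one module writes to the pipe while the next reads from it. A sketch (note the pipe needs two threads, or it can deadlock once its buffer fills):

```java
import java.io.*;

public class PipeDemo {
    // Sends text through a pipe: a producer writes on one thread, the caller reads.
    static String throughPipe(String text) throws IOException, InterruptedException {
        PipedInputStream in = new PipedInputStream();
        PipedOutputStream out = new PipedOutputStream(in); // connect both ends

        // Producer module writes on its own thread; closing signals end-of-stream.
        Thread producer = new Thread(() -> {
            try (OutputStream o = out) {
                o.write(text.getBytes());
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        producer.start();

        // Consumer module reads until the producer closes its end.
        ByteArrayOutputStream received = new ByteArrayOutputStream();
        int c;
        while ((c = in.read()) != -1) {
            received.write(c);
        }
        in.close();
        producer.join();
        return received.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(throughPipe("across the pipe")); // prints "across the pipe"
    }
}
```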

Putting the OutputStream-to-InputStream conversion in a utility class is the way I am doing it. But it doesn't change the fact that it is awkward and counter-intuitive. What I'm saying is, if streams are the way to go, there should be some elegant way to do the OutputStream/InputStream conversion, instead of all sorts of hacks. Even Apache IOUtils only implements copying from InputStream to OutputStream, but not the other way around.

java - Using InputStream/OutputStream to handle data flows through mod...

java inputstream outputstream dataflow

On the server-side you can simply store data in blobstore via the File API.

This is not ideal, since it is experimental, but I will give it a try.

We use it already about 9 months and it works without a problem.

@Jarrod - it's a bitch when an API is recently changed for an answer that is a year old, isn't it? Btw, thanks for the -1

@PeterKnego - GCS is for large files. What if I want an API to store small files?

How to store an Inputstream (image) into Google App Engine Blobstore u...

java google-app-engine blobstore

I try to keep as much in RAM as possible (mostly for performance reasons; RAM is cheap). So I'm using a FileBackedBuffer to "save" data of unknown size. It has a limit. When fewer than limit bytes are written to it, it keeps them in an internal buffer. If more data is written, it creates an actual file. This class has methods to get an InputStream and an OutputStream from it, so the calling code isn't bothered with the petty details.
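
The FileBackedBuffer described is the answer author's own class; a minimal sketch of the idea (class name, limit, and spill logic are illustrative) might look like:

```java
import java.io.*;

// Minimal sketch: buffers writes in memory until `limit` bytes, then spills to a temp file.
public class FileBackedBuffer extends OutputStream {
    private final int limit;
    private ByteArrayOutputStream memory = new ByteArrayOutputStream();
    private File file;               // created lazily once the limit is exceeded
    private OutputStream fileOut;

    public FileBackedBuffer(int limit) {
        this.limit = limit;
    }

    @Override
    public void write(int b) throws IOException {
        if (fileOut == null && memory.size() >= limit) {
            // Spill: create the backing file and move the buffered bytes there.
            file = File.createTempFile("buffer", ".tmp");
            fileOut = new BufferedOutputStream(new FileOutputStream(file));
            memory.writeTo(fileOut);
            memory = null;
        }
        if (fileOut != null) {
            fileOut.write(b);
        } else {
            memory.write(b);
        }
    }

    // Callers read the data back without caring where it lives.
    public InputStream getInputStream() throws IOException {
        if (fileOut != null) {
            fileOut.close();
            return new BufferedInputStream(new FileInputStream(file));
        }
        return new ByteArrayInputStream(memory.toByteArray());
    }
}
```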

bytearray - Java Project Modules - use InputStream/OutputStream or .tm...

bytearray resources inputstream temporary-files outputstream

public class OuterObject {
    List<ClassToStore> messages;
}

Gson gson = new Gson();
Type type = new TypeToken<List<OuterObject>>(){}.getType();
List<OuterObject> outerList = gson.fromJson(reader, type);
List<ClassToStore> listOfMessages = outerList.get(0).messages;

JSON parsing only one branch of whole message java using GsonBuilder a...

java android json parsing