02-12-2002, 20:16
niemand
I suspect there's actually little point in asking this here, but I'm doing it anyway because I'm getting a bit desperate.

For a school project I'm trying to read out a webcam via the JMF (Java Media Framework) in Java.
The idea is that, once that works, we put the image straight onto the screen.
For that we wrote/looked up the code below; jmfWebcam.java and FrameGrabberException come from an article at Sun.

The code works fine on my 2GHz machine, but on the 400MHz one we get a delay in the image of about 2-4 seconds, which is a problem because we need the picture inside the program as quickly as possible.
So what we're looking for is a way to make this program faster.

We use the same webcam at the same resolution on both PCs, with JDK 1.3.1 and JMF 2.1.1b under Win98SE.

To be precise, the PCs are:
1. AMD XP2000+, 256MB DDR-333
2. PII 400MHz, 64 MB RAM (don't know how fast)


Since it's quite a lot of code, it can also be downloaded from my site: http://oege.ie.hva.nl/~romijn09/


I've already been given the following tips, but they haven't led to a solution:
- Lower the resolution (works, but isn't allowed)
- Make the frame smaller
- Don't repaint so often and turn off the double buffer (see the sketch below)
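
To be concrete about that last tip, here is a minimal sketch of what I understand it to mean: drive the repaints from a separate thread at a fixed pace instead of calling repaint() inside paint(). The class name and the 80 ms interval are made up for the example, and I haven't verified that this actually makes a difference on the slow machine.
Code:
package webcamtoscreen;

import java.awt.Frame;
import java.awt.Graphics;

// Sketch only: repaints are requested roughly 12 times per second from a
// background thread, and paint() just draws the latest frame.
public class PacedWebcamFrame extends Frame implements Runnable {
    private final jmfWebcam camera;

    public PacedWebcamFrame(jmfWebcam camera) {
        super("Paced repaint sketch");
        this.camera = camera;
        setSize(640, 480);
        setVisible(true);
        new Thread(this).start();   // drives the repaints
    }

    public void run() {
        while (true) {
            repaint();              // ask AWT for a redraw
            try { Thread.sleep(80); } catch (InterruptedException e) { return; }
        }
    }

    public void paint(Graphics g) {
        // getBufferedImage() blocks until the grab thread has produced its
        // first frame; after that it returns the most recent frame.
        // No sleep and no repaint() in here.
        g.drawImage(camera.getBufferedImage(), 0, 0, this);
    }
}
It would be constructed from main() after camera.start(), e.g. new PacedWebcamFrame(camera);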


I think the problem lies in this direction: the moment a webcam object is created, it continuously reads images and places them in a buffer. When you then request an image, you get one from that buffer which, unless it is the only image in there, is not a real-time image.

That's why there is a thread that continuously reads that buffer. I suspect that thread simply runs far too slowly; does anyone have ideas on how to raise its priority?
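
The only thing I can think of myself is something like the snippet below, since jmfWebcam extends Thread. This is just a sketch; I have no idea whether the JMF-internal threads take any notice of it, so whether it actually helps on the 400MHz machine remains to be seen.
Code:
try {
    jmfWebcam camera = new jmfWebcam();
    // jmfWebcam extends Thread, so the priority of its grab loop can be raised
    camera.setPriority(Thread.MAX_PRIORITY);
    camera.start();
}
catch (FrameGrabberException e) {
    System.out.println(e);
}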


One more thing for clarity: jmfWebcam.java contains
Code:
  private final static String DEFAULT_DEV_NAME = 
    "vfw:Logitech USB Video Camera:0"; 
  private final static String DEFAULT_X_RES = "160"; 
  private final static String DEFAULT_Y_RES = "120";
but it doesn't use those defaults; it reads from a file called video.properties, which tells it to use "vfw:USB PC Camera:0" at 640x480.
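
In other words, the properties file looks roughly like this. The keys are the ones setup() in jmfWebcam.java reads; the colour-depth value shown here is just the default of 24 and may differ in my actual file.
Code:
device-name=vfw:USB PC Camera:0
resolution-x=640
resolution-y=480
colour-depth=24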

webcam.java:
Code:
package webcamtoscreen; 

import java.applet.*; 
import java.awt.*; 
import java.awt.image.BufferedImage; 
import java.io.*; 
import java.awt.image.*; 
import java.awt.Component; 
import webcamtoscreen.jmfWebcam.*; 
import webcamtoscreen.FrameGrabberException; 


public class webcam extends Frame { 
    static jmfWebcam camera; 
    Image OSC; 
    int widthOfOSC, heightOfOSC; 
    static BufferedImage image; 

    public static void main(String[] args) { 
        new webcam(); 
        System.out.println("frame gemaakt"); 

        try{ 
            camera = new jmfWebcam(); 
            System.out.println("jmfwebcam gemaakt"); 
            camera.start(); 
        } 
        catch(FrameGrabberException e){ 
            System.out.println(e); 
        } 

        image = camera.getBufferedImage(); 
    } 


    public void update(Graphics g) { // double buffering 
        if (OSC == null || widthOfOSC != getSize().width || heightOfOSC != getSize().height) { 
            OSC = null; 
            OSC = createImage(getSize().width, getSize().height); 
            widthOfOSC = getSize().width; 
            heightOfOSC = getSize().height; 
        } 

        Graphics OSG = OSC.getGraphics(); 
        OSG.setColor(getBackground()); 
        OSG.fillRect(0, 0, widthOfOSC, heightOfOSC); 
        OSG.setColor(getForeground()); 
        OSG.setFont(getFont()); 
        paint(OSG); 
        g.drawImage(OSC,0,0,this); 
    } 


    public void paint( Graphics g ) { 
        
        // check whether at least one image has been read in yet 
        if (image!=null){ 
           slaap(100); 
           image = camera.getBufferedImage(); 
           g.drawImage( image,0,0,this ); 
        } 
         
        repaint(); 
   } 


    public void slaap (int m){ 
        try { Thread.sleep(m); } 
        catch (InterruptedException e) {} 
    } 


    public webcam() { 
         //Title our frame. 
        super("Length Vision - Webcam"); 

        //Set the size for the frame. 
        setSize(640,480); 

        //We need to turn on the visibility of our frame 
        //by setting the Visible parameter to true. 
        setVisible(true); 
    } 
}
jmfWebcam.java:
Code:
/** 
*  JMF/Webcam Frame Grabber Demo 
* 
* @author S.Ritter  24.01.2002 
* @version 1.0 
* 
*  ALL EXAMPLES OF CODE AND/OR COMMAND-LINE INSTRUCTIONS ARE BEING 
*  PROVIDED BY SUN AS A COURTESY, "AS IS," AND SUN DISCLAIMS ANY AND 
*  ALL WARRANTIES PERTAINING THERETO, INCLUDING ANY WARRANTIES OF 
*  MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR NON-INFRINGEMENT. 
*  SUN IS NOT LICENSING THIS EXAMPLE FOR ANY USE OTHER THAN FOR THE 
*  EDUCATIONAL PURPOSE OF SHOWING THE FUNCTIONALITY CONTAINED 
*  THEREIN, BY WAY OF EXAMPLE. 
**/ 
package webcamtoscreen; 

import java.io.*; 
import java.awt.*; 
import java.awt.image.*; 
import java.util.Enumeration; 
import java.util.Properties; 
import java.util.Vector; 
import javax.media.*; 
import javax.media.control.*; 
import javax.media.protocol.*; 
import javax.media.util.*; 
import javax.media.format.RGBFormat; 
import javax.media.format.VideoFormat; 

/** 
*  Frame grabber class 
**/ 
public class jmfWebcam extends Thread implements ControllerListener { 
  /*  Default device name and format parameters to use if no properties file 
   *  is provided 
   */ 
  private final static String DEFAULT_DEV_NAME = 
    "vfw:Logitech USB Video Camera:0"; 
  private final static String DEFAULT_X_RES = "160"; 
  private final static String DEFAULT_Y_RES = "120"; 
  private final static String DEFAULT_DEPTH = "24"; 

  private Properties videoProperties; 

  /*  These objects are used for controlling access via wait and notify to 
   *  ensure that the processor has been realised and the second thread has 
   *  completed its startup 
   */ 
  private Object stateLock = new Object(); 
  private Object runLock = new Object(); 

  private Processor deviceProc = null; 
  private PushBufferStream camStream; 
  private PushBufferDataSource source = null; 
  private BufferToImage converter; 
  private Image currentImage; 
  private boolean threadRunning = false; 

  /** 
   *  Constructor 
   * 
   * 
   * @throws FrameGrabberException If we can't start up the camera 
   **/ 
  public jmfWebcam() throws FrameGrabberException { 
    /*  If the user chooses the no parameter form of the constructor we 
     *  try to get the video property file name from the properties 
     *  passed on the command line 
     */ 
    String videoPropFile = 
      System.getProperty("video.properties", "video.properties"); 

    setup(videoPropFile); 
  } 

  /** 
   *  Constructor 
   * 
   * @param videoPropFile The name of the video properties file 
   * @throws FrameGrabberException If we can't start up the camera 
   **/ 
  public jmfWebcam(String videoPropFile) throws FrameGrabberException { 
    setup(videoPropFile); 
  } 

  /** 
   *  Setup method.  Configures webcam and JMF ready to get images 
   * 
   * @param videoPropFile The name of the video properties file 
   * @throws FrameGrabberException If we can't start up the camera 
   **/ 
  private void setup(String videoPropFile) throws FrameGrabberException { 
    videoProperties = new Properties(); 

    if (videoPropFile != null) { 
      try { 
        FileInputStream fis = new FileInputStream(new File(videoPropFile)); 
        videoProperties.load(fis); 
      } catch (IOException ioe) { 
        System.out.println("Unable to access video properties"); 
        System.out.println(ioe.getMessage()); 
      } 
    } 

    Dimension viewSize = null; 
    int viewDepth = 0; 

    String cameraDevice = 
      videoProperties.getProperty("device-name", DEFAULT_DEV_NAME); 

    /*  Get the parameters for the video capture device from the properties 
     *  file.  If not defined use default values 
     */ 
    try { 
      String pValue = 
        videoProperties.getProperty("resolution-x", DEFAULT_X_RES); 
      int xRes = Integer.parseInt(pValue); 
      pValue = videoProperties.getProperty("resolution-y", DEFAULT_Y_RES); 
      int yRes = Integer.parseInt(pValue); 
      viewSize = new Dimension(xRes, yRes); 
      pValue = videoProperties.getProperty("colour-depth", DEFAULT_DEPTH); 
      viewDepth = Integer.parseInt(pValue); 
    } catch (NumberFormatException nfe) { 
      System.out.println("Bad numeric value in video properties file"); 
      System.exit(1); 
    } 

    /*  Try to get the CaptureDevice that matches the name supplied by the 
     *  user 
     */ 
    CaptureDeviceInfo device = CaptureDeviceManager.getDevice(cameraDevice); 

    if (device == null) 
      throw new FrameGrabberException("No device found [ " + 
        cameraDevice + "]"); 

    RGBFormat userFormat = null; 
    Format[] cfmt = device.getFormats(); 

    /*  Find the format that the user has requested (if available)  */ 
    for (int i = 0; i < cfmt.length; i++) { 
      if (cfmt[i] instanceof RGBFormat) { 
        userFormat = (RGBFormat)cfmt[i]; 
        Dimension d = userFormat.getSize(); 
        int bitsPerPixel = userFormat.getBitsPerPixel(); 

        if (viewSize.equals(d) && bitsPerPixel == viewDepth) 
          break; 

        userFormat = null; 
      } 
    } 

    /*  Throw an exception if we can't find a format that matches the 
     *  user's criteria 
     */ 
    if (userFormat == null) 
      throw new FrameGrabberException("Requested format not supported"); 

    /*  To use this device we need a MediaLocator  */ 
    MediaLocator loc = device.getLocator(); 

    if (loc == null) 
      throw new FrameGrabberException("Unable to get MediaLocator for device"); 

    DataSource formattedSource = null; 

    /*  Now create a dataSource for this device and set the format to 
     *  the one chosen by the user. 
     */ 
    try { 
      formattedSource = Manager.createDataSource(loc); 
    } catch (IOException ioe) { 
      throw new FrameGrabberException("IO Error creating dataSource"); 
    } catch (NoDataSourceException ndse) { 
      throw new FrameGrabberException("Unable to create dataSource"); 
    } 

    /*  Setting the format is rather complicated.  Firstly we need to get 
     *  the format controls from the dataSource we just created.  In order 
     *  to do this we need a reference to an object implementing the 
     *  CaptureDevice interface (which DataSource objects can). 
     */ 
    if (!(formattedSource instanceof CaptureDevice)) 
      throw new FrameGrabberException("DataSource not a CaptureDevice"); 

    FormatControl[] fmtControls = 
      ((CaptureDevice)formattedSource).getFormatControls(); 

    if (fmtControls == null || fmtControls.length == 0) 
      throw new FrameGrabberException("No FormatControl available"); 

    Format setFormat = null; 

    /*  Now we need to loop through the available FormatControls and try 
     *  to set the format to the one we want.  According to the documentation 
     *  even though this may appear to work, it may fail later on.  Since 
     *  we know that the format is supported we hope that this won't happen 
     */ 
    for (int i = 0; i < fmtControls.length; i++) { 
      if (fmtControls[i] == null) 
        continue; 

      if ((setFormat = fmtControls[i].setFormat(userFormat)) != null) 
        break; 
    } 

    /*  Throw an exception if we couldn't set the format  */ 
    if (setFormat == null) 
      throw new FrameGrabberException("Failed to set camera format"); 

    /*  Connect to the DataSource  */ 
    try { 
      formattedSource.connect(); 
    } catch (IOException ioe) { 
      throw new FrameGrabberException("Unable to connect to DataSource"); 
    } 

    /*  Since we don't want to display the output to the user at this stage 
     *  we use a processor rather than a player to get frame access 
     */ 
    try { 
      deviceProc = Manager.createProcessor(formattedSource); 
    } catch (IOException ioe) { 
      throw new FrameGrabberException("Unable to get Processor for device: " + 
        ioe.getMessage()); 
    } catch (NoProcessorException npe) { 
      throw new FrameGrabberException("Unable to get Processor for device: " + 
        npe.getMessage()); 
    } 

    /*  In order to use the controller we have to put it in the realized 
     *  state.  We do this by calling the realize method, but this will 
     *  return immediately so we must register a listener (this class) to 
     *  be notified when the controller is ready. 
     */ 
    deviceProc.addControllerListener(this); 
    deviceProc.realize(); 

    /*  Wait for the device to send an event telling us that it has 
     *  reached the realized state 
     */ 
    while (deviceProc.getState() != Controller.Realized) { 
      synchronized (stateLock) { 
        try { 
          stateLock.wait(); 
        } catch (InterruptedException ie) { 
          throw new FrameGrabberException("Failed to get to realized state"); 
        } 
      } 
    } 

    deviceProc.start(); 

    /*  Get access to the PushBufferDataSource which will provide us with 
     *  a means to get at the frame grabber 
     */ 
    try { 
      source = (PushBufferDataSource)deviceProc.getDataOutput(); 
    } catch (NotRealizedError nre) { 
      /*  Should never happen  */ 
      throw new FrameGrabberException("Processor not realized"); 
    } 

    /*  Now we can retrieve the PushBufferStreams that will enable us to 
     *  access the data from the camera 
     */ 
    PushBufferStream[] streams = source.getStreams(); 
    camStream = null; 

    for (int i = 0; i < streams.length; i++) { 
      /*  Use the first Stream that is RGBFormat (there should be only one)  */ 
      if (streams[i].getFormat() instanceof RGBFormat) { 
        camStream = streams[i]; 
        RGBFormat rgbf = (RGBFormat)streams[i].getFormat(); 
        converter = new BufferToImage(rgbf); 
        break; 
      } 
    } 

    System.out.println("Capture device ready"); 
  } 

  /** 
   *  Get an image from the camera as an AWT Image object 
   * 
   * @return The current image from the camera 
   **/ 
  public Image getImage() { 
    /*  Since we are using a second thread to grab the images from the webcam 
     *  we need to ensure that an image has been acquired. 
     *  We do this by using a flag which will be set to true in the run() 
     *  method.  If this is false we wait until the run method notifies us 
     *  that there is an image to collect 
     */ 
    while (threadRunning == false) { 
      synchronized (runLock) { 
        try { 
          runLock.wait(); 
        } catch (InterruptedException ie) { 
          // Ignore 
        } 
      } 
    } 

    return accessInternalImage(null); 
  } 

  /** 
   *  Get an image from the camera as a BufferedImage 
   * 
   * @return The current image from the camera 
   **/ 
  public BufferedImage getBufferedImage() { 
    return (BufferedImage)getImage(); 
  } 

  /** 
   *  Run method for Thread class 
   **/ 
  public void run() { 
    System.out.println("Capture thread starting..."); 
    Buffer b = new Buffer(); 

    /*  Simply loop forever grabbing images from the web cam and storing 
     *  them so that the user can retrieve them when required. 
     */ 
    while (true) { 
      try { 
        camStream.read(b); 
      } catch (Exception e) { 
        //  Ignore.  Nothing we can really do about this 
      } 

      Image i = converter.createImage(b); 
      accessInternalImage(i); 

      /*  If this is the first image we've collected we need to advertise 
       *  to the main thread that there is an image ready and then notify 
       *  the main thread in case it is waiting on the image 
       */ 
      if (!threadRunning) { 
        threadRunning = true; 

        synchronized (runLock) { 
          runLock.notifyAll(); 
        } 
      } 
    } 
  } 

  /** 
   *  Method called when a controller event is received (implements 
   *  ControllerListener interface) 
   * 
   * @param ce The controller event 
   **/ 
  public void controllerUpdate(ControllerEvent ce) { 
    if (ce instanceof RealizeCompleteEvent) { 
      synchronized (stateLock) { 
        stateLock.notifyAll(); 
      } 
    } 
  } 

  /** 
   *  Method that controls access to the global image variable.  This ensures 
   *  that there is no confusion over one thread reading an image whilst 
   *  another is writing to it 
   * 
   * @param image The image to store (null indicates retrieval of the image) 
   * @return The image (if the parameter was null) 
   **/ 
  private synchronized Image accessInternalImage(Image image) { 
    if (image == null) { 
      return currentImage; 
    } 

    currentImage = image; 
    return null; 
  } 
}
FrameGrabberException.java:
Code:
package webcamtoscreen; 

/** 
*  Exception that is thrown if the frame grabber doesn't work correctly 
**/ 
public class FrameGrabberException extends Exception { 
  public FrameGrabberException(String msg) { 
    super(msg); 
  } 
}

Last edited 02-12-2002 at 21:51.