[LAD] FW: [piksel] [Jack-Devel] Re : The future of videojack ?

From: Ivica Ico Bukvic <ico@email-addr-hidden>
Date: Thu May 08 2008 - 05:12:39 EEST

FYI

-----Original Message-----
From: piksel-bounces@email-addr-hidden [mailto:piksel-bounces@email-addr-hidden] On Behalf Of
salsaman@email-addr-hidden
Sent: Wednesday, May 07, 2008 6:31 PM
To: p1k53l workshop
Cc: jack-devel@email-addr-hidden
Subject: Re: [piksel] [Jack-Devel] Re : The future of videojack ?

On Wed, May 7, 2008 22:04, Stéphane Letz wrote:
>
> On 7 May 08 at 21:37, Juuso Alasuutari wrote:
>
>> Stéphane Letz wrote:
>>> Video in jack1 won't happen, for several reasons that can be
>>> explained again: we want to fix and release jack1 soon, and video
>>> in jack is too big a change to integrate given the current state
>>> of the proposed patch.
>>> The future of jack is now jack2, based on the new jackdmp
>>> implementation (http://www.grame.fr/~letz/jackdmp.html). A lot of
>>> work has already been done in this code base, which is now API
>>> equivalent to jack1. New features are already being worked on, like the
>>> DBUS based control (developed in the "control" branch) and NetJack
>>> rework (developed in the "network" branch).
>>> I think a combined "video + audio in a unique server" approach is
>>> perfectly possible: this would require having two separate graphs
>>> for audio and video, each running at its own rate. Video and audio
>>> would be handled in different callbacks and thus in different
>>> threads (probably running at two different priorities, so that
>>> audio can "interrupt" video). Obviously, doing this the right way
>>> would require a bit of work, but it is probably much easier to
>>> design and implement in the jackd2 codebase.
>>> Thus I think a better overall approach to avoid a "video jack fork"
>>> is to work in this direction, possibly by implementing video jack
>>> with the "separated server" idea first (since it is easier to
>>> implement). This could be started right away in a jack2 branch.
>>
>> I'll throw in my 2 Euro cents.
>>
>> If the VideoJACK crowd feels that JACK2 development is moving too
>> slowly and decides to continue with their fork, may I suggest that we
>> all still discuss and draft a proper video API together? If a fork
>> happens for practical reasons, it would be best to make sure that
>> switching video software to use JACK2 later on will be as painless as
>> possible.
>>
>> Technical issues aside, I hope that those affiliated with VideoJACK
>> do not feel that their needs are being neglected by the JACK
>> developers. I hope the recent discussion has shown that people in
>> this camp are willing to improve JACK in this respect. Perhaps we
>> could move on and try to find more common ground?
>>
>> Juuso
>
> Yes sure.
>
> Where is the latest state of the video patch for jack? I can have a
> look and see how easy/difficult it would be to implement that in a
> jackdmp/jack2 branch.
>
> Stephane
>

Please forward this to the LAD mailing list, since I am not subscribed
there and am unable to post.

I will briefly describe the current API below. It would be nice if
everybody could agree that this is OK; otherwise, please can we resolve it
quickly - the project is going slowly enough as it is.

----------------------------------------------------------

The videojack patch adds a new port type - video.
#define JACK_VIDEO_PORT_TYPE 2
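
For illustration, a source client would register a port of the new type
roughly as follows. This is only a sketch: the type string "video" and the
name JACK_DEFAULT_VIDEO_TYPE are placeholders of mine, since this summary
does not give the actual registration string used by the patch.

#include <jack/jack.h>

/* Hypothetical type string - substitute whatever the patch actually
   registers the video port type under. */
#define JACK_DEFAULT_VIDEO_TYPE "video"

jack_client_t *client;
jack_port_t *video_out;

static void open_video_source (void)
{
  client = jack_client_open ("video_source", JackNullOption, NULL);
  video_out = jack_port_register (client, "out", JACK_DEFAULT_VIDEO_TYPE,
                                  JackPortIsOutput, 0);
}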

There is no mixing function for video, since there are dozens or maybe
hundreds of different ways to mix two or more video frames together; all
mixing is done by a host application.

Therefore each video buffer can have only one source, but may have
multiple sinks.

The size of the video buffer is initially set to 0x0, and the colorspace
is set to the default (JACK_VIDEO_COLORSPACE_RGBA32).

Only the source client may set the frame size, which it can do at any
time; it may also change the colorspace. Although only RGBA32 is
officially supported, other colorspaces may be used on private networks.

The source client sets width and height using:
int jack_video_set_width_and_height (jack_client_t *client,
                                     jack_port_t *port,
                                     uint32_t width, uint32_t height)

and may set the colorspace with:
int jack_video_set_colorspace (jack_client_t *client, jack_port_t *port,
                               jack_video_colorspace_t cspace)

These functions may only be called by the source client, and only after
it has been activated.

These functions also set the buffer size to 4*width*height (4 bytes per
pixel for RGBA32).
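
A rough sketch of the source side (640x480 is just an example size, error
checking is omitted, and I am assuming the patched declarations are
available from <jack/jack.h>):

#include <jack/jack.h>

static void configure_source (jack_client_t *client, jack_port_t *video_out)
{
  /* The frame size may only be set by the source client, after activation. */
  jack_activate (client);

  /* This also sets the port buffer size to 4 * 640 * 480 bytes. */
  jack_video_set_width_and_height (client, video_out, 640, 480);

  /* RGBA32 is the default, so this call is only needed when using another
     colorspace on a private network. */
  jack_video_set_colorspace (client, video_out, JACK_VIDEO_COLORSPACE_RGBA32);
}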

Likewise, there are the functions:
uint32_t jack_video_get_width(jack_client_t *client, jack_port_t *port)
uint32_t jack_video_get_height(jack_client_t *client, jack_port_t *port)
jack_video_colorspace_t jack_video_get_colorspace (jack_client_t *client,
                                                   jack_port_t *port)
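
For example, a sink client could query the current frame format like this
(same header assumption as above):

#include <stdio.h>
#include <stdint.h>
#include <jack/jack.h>

static void print_video_format (jack_client_t *client, jack_port_t *video_in)
{
  uint32_t w = jack_video_get_width (client, video_in);
  uint32_t h = jack_video_get_height (client, video_in);
  jack_video_colorspace_t cs = jack_video_get_colorspace (client, video_in);

  printf ("receiving %ux%u frames, colorspace %d\n",
          (unsigned) w, (unsigned) h, (int) cs);
}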

A client may set callbacks for these:

int jack_set_video_colorspace_callback (jack_client_t *client,
                                        JackVideoColorspaceCallback video_colorspace_callback,
                                        void *arg);

int jack_set_video_size_callback (jack_client_t *client,
                                  JackVideoSizeCallback video_size_callback,
                                  void *arg);

In practice the client will probably only receive a buffer size change
notification, since jack can only send one message per change (this
appears to be a limitation in jackd).
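
As a sketch, a sink might register for these notifications as below. Note
that the exact JackVideoSizeCallback signature is not given in this
summary, so the (width, height, arg) form here is only my assumption,
modelled on the other JACK callbacks:

#include <stdio.h>
#include <stdint.h>
#include <jack/jack.h>

/* Assumed callback signature - the real typedef in the patch may differ. */
static int video_size_changed (uint32_t width, uint32_t height, void *arg)
{
  fprintf (stderr, "frame size changed to %ux%u\n",
           (unsigned) width, (unsigned) height);
  return 0;
}

static void register_video_callbacks (jack_client_t *client)
{
  /* In practice, expect only the buffer size change message to arrive. */
  jack_set_video_size_callback (client, video_size_changed, NULL);
}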

There is also a convenience function:
void* jack_video_get_framebuffer(jack_port_t* video_port)
which just does:
  return jack_port_get_buffer (video_port, 1);

And that's it! This is enough to get video working in jack.
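
Putting the pieces together, the process callback of a simple source
client might look roughly like this. It is only a sketch: with the dummy
driver settings shown below (-r 25 -p 1) each process cycle should
correspond to one video frame, and the frame content here is just a dummy
fill.

#include <string.h>
#include <stdint.h>
#include <jack/jack.h>

static jack_client_t *client;
static jack_port_t   *video_out;

static int process (jack_nframes_t nframes, void *arg)
{
  uint32_t w = jack_video_get_width (client, video_out);
  uint32_t h = jack_video_get_height (client, video_out);
  unsigned char *frame = jack_video_get_framebuffer (video_out);

  /* Dummy content: fill the RGBA32 frame with opaque white. */
  if (frame && w && h)
    memset (frame, 0xff, 4 * w * h);

  return 0;
}

The callback is registered with jack_set_process_callback (client,
process, NULL) before activating the client, exactly as for audio.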

There are a couple of changes I plan to make, which are just nice-to-have:
- make rate a float (for the video clock)
- use an environment variable to define the location of ~/.jackdrc (this
will go away if video and audio clocks can be combined in one server)

Also, I will patch jack_connect, jack_disconnect, and jack_lsp to take a
server name. This is necessary right now when running two servers (video
and audio), but it seems like a good idea to keep anyway.

And as suggested, jack_set_video_process_callback() will just call
jack_set_process_callback() (for now). Later it could link to a video
clock in a combined server.
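
For what it is worth, that wrapper could be as simple as the following
(assuming the video process callback reuses the JackProcessCallback type
for now, which is not stated explicitly above):

#include <jack/jack.h>

int
jack_set_video_process_callback (jack_client_t *client,
                                 JackProcessCallback video_callback,
                                 void *arg)
{
  /* For now just forward to the ordinary process callback; in a combined
     server this would instead hook into the separate video clock. */
  return jack_set_process_callback (client, video_callback, arg);
}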

There was a further suggestion that the input client could set a "frame
count" which can be read by the output clients. This should also be
feasible.

--------------------------------------------------------

The current code is at:
http://bekstation.bek.no/piksel/pikseldev/jack-vjack5.tar.bz2

The startup parameters I use are:
/usr/bin/jackd -p 10 -t 40 -d dummy -r 25 -p 1 -P 0 -C 0

Note that the final -p 1 is necessary; otherwise you get a floating point
exception. All the other parameters can be adjusted as necessary.

There are also various clients - jack_video_test_generator and
jack_video_output are shipped by default.

On the piksel site there is a hacked version of camorama:
http://bekstation.bek.no/piksel/pikseldev/camorama-vjack.tar.bz2

and ekiga:
http://www.xs4all.nl/~salsaman/ekiga-2.0.12-hacked-for-vjack.tar.bz2

LiVES also supports vjack in and vjack out. You must compile it with
--enable-vjack. The vjack output client can be selected under
Preferences/Playback. The input client (generator) must be bound to an
effect key (VJ/Realtime effect mapping) and then activated.

Regards,
Salsaman.

_______________________________________________
piksel mailing list
piksel@email-addr-hidden
https://www.bek.no/mailman/listinfo/piksel
http://www.piksel.no

_______________________________________________
Linux-audio-dev mailing list
Linux-audio-dev@email-addr-hidden
http://lists.linuxaudio.org/mailman/listinfo/linux-audio-dev
Received on Thu May 8 08:15:01 2008

This archive was generated by hypermail 2.1.8 : Thu May 08 2008 - 08:15:02 EEST