Message ID: 20130901064216.GN6617@n2100.arm.linux.org.uk (mailing list archive)
State: New, archived
On 09/01/2013 08:42 AM, Russell King - ARM Linux wrote:
> [...]
> 21:07 < rmk> well, someone needs to come up with a fix for these
>              duplicated widgets, and that's not me.
> 21:07 < rmk> because... at the moment... asoc is completely useless
>              for this hardware
>
> The reason for my last couple of lines there is not that I'm being
> difficult - I really don't know what the correct fix for the duplicated

Well then say it that way: "I don't quite understand how this works yet and am unable to fix this myself, can you help me figure it out?" But maybe it is a cultural thing, who knows.

Let's try to wrap up the situation:

* The hardware has one audio stream, but two DAIs, one for SPDIF and one
  for I2S. The same audio stream is sent to both DAIs at the same time
  (it is possible, though, to disable one or both of the DAIs).

* This is something new and not supported by classical ASoC.

* DPCM has support for this, but DPCM is still new, unstable,
  under-documented and apparently has a couple of bugs.

* With non-DPCM ASoC it is possible to have two DAIs if they are not used
  at the same time (which is what I recommend you implement first, before
  trying to get DPCM running).

I still don't know if you actually need the feature of being able to output the same audio signal to both DAIs - do you have such a board? But even then I still recommend first getting the non-DPCM either/or approach implemented, and once that's working, trying to get DPCM running. That probably involves fixing some of the DPCM issues in the core. As I said, sending the same audio stream to two DAIs is something new, and if there were no DPCM yet you'd need to add support for sending the same stream to multiple DAIs. So either way you'd have to get your hands dirty. And I'm sure people are willing to help you figure out the parts you don't understand yet if you ask _nicely_.
I mean, I don't come to you either when I have a new ARM SoC that's not supported yet, demand that you implement support for it, and exclaim that the ARM port sucks because it doesn't support that SoC yet.

- Lars
On Sun, Sep 01, 2013 at 09:42:29AM +0200, Lars-Peter Clausen wrote:
> Let's try to wrap up the situation:
>
> * The hardware has one audio stream, but two DAIs, one for SPDIF and
>   one for I2S. The same audio stream is sent to both DAIs at the same
>   time (it is possible, though, to disable one or both of the DAIs).

More or less. To be more clear: audio DMA commences when either one or both outputs are enabled, and stops when both are disabled.

This is why either only one can be enabled, or, if both are to be enabled, both enable bits must be set in one register write, and when disabled, both bits must be cleared in one register write.

So, let's say for argument's sake that you wanted to go from disabled to a single output, then to dual output, back to single output, and finally back to disabled. You would need this sequence:

- enable single output
  ...
- disable single output
- wait for audio unit to indicate not busy
- enable both outputs
  ...
- disable both outputs
- wait for audio unit to indicate not busy
- enable single output
  ...
- disable single output
- wait for audio unit to indicate not busy

> * This is something new and not supported by classical ASoC.
>
> * DPCM has support for this, but DPCM is still new, unstable,
>   under-documented and apparently has a couple of bugs.
>
> * With non-DPCM ASoC it is possible to have two DAIs if they are not
>   used at the same time (which is what I recommend you implement first,
>   before trying to get DPCM running).

If you look at my other responses, you'll see that this is what I tried back in May, and I was unhappy about that solution because:

1. there is no guarantee that they couldn't be used at the same time.
2. this results in two entirely separate "CPU DAI"s, each with their own
   independent sample rate/format settings, which, if they happen to be
   used together, will result in fighting over the same register(s).
Moreover, this results in a completely different set of changes to the driver which are in an opposing direction to the DPCM approach.

> I still don't know if you actually need the feature of being able to
> output the same audio signal to both DAIs - do you have such a board?

This board has the SPDIF connected to a TOSlink and an HDMI transmitter. It also has the I2S connected only to the HDMI transmitter, though it's common at the moment to only use the SPDIF output to drive them both.

Having recently changed the TV connected to the HDMI, I find that where audio used to work at 48kHz and 44.1kHz with the old TV, it no longer works at anything but 44.1kHz. The old TV converted everything to analogue internally in its HDMI receiver before passing it on for further processing. The new TV is a modern full HD model, so it keeps everything in the digital domain. I have yet to work out why the TV is muting itself with 48kHz audio. However, audio continues to work via the TOSlink output at both sample rates.

What I'm saying there is that we may need to have another codec in there so the HDMI transmitter has access to parameters like the sample rate, or we may have to switch it to using I2S for PCM audio and back to SPDIF for compressed audio (I'm hoping not, because I think that's going to be an extremely hard problem to solve).

This is a brand new problem which I only discovered during the last week. As I have no SPDIF or HDMI analysers, I don't have an answer for this at the moment, and the only way I can solve it is by verifying the existing setup (which I believe is correct to the HDMI v1.3 spec) and experimenting, which will take some time.

However, that's not a reason to hold up these patches - these patches do work, and allow audio to be used on this platform in at least some configurations.

> But even then I still recommend to first get the non-DPCM either/or
> approach implemented and once that's working try to get DPCM running.
> Which probably involves fixing some of the DPCM issues in the core. As
> I said sending the same audio streams to two DAIs is something new and
> if there was no DPCM yet you'd need to add support for sending the same
> stream to multiple DAIs. So either way you'd have to get your hands
> dirty.

Could you comment on the patch which creates the two front-end DAIs which I sent in a different sub-thread - the one which I indicated was from back in May? It isn't quite suitable for submission because the base it applies to has changed since then, but it should be sufficient to give an idea of the solution I was trying to implement there.

> And I'm sure people are willing to help you figure out the parts you
> don't understand yet if you ask _nicely_.

Can you then please explain why, when I ask for help understanding DAPM in a "nice" way, the response I get is just "it's just a graph walk" and no further technical details?

I explained DAPM as I understood it to someone who knows nothing about it a fortnight ago, and this is how I described DAPM with only a basic understanding of it, most of that gathered by having been through some of the code with printk()s to work out some of the problems I was seeing:

| DAPM is a set of "widgets" representing various components of an
| audio system. The widgets are linked together by a graph. Each
| widget lives in a context - cpu, platform or codec. Some bindings
| only happen within a context, others cross contexts (so linking the
| CPU audio stream to the codec for example)

I didn't want to go into the complexities of which widgets are activated when a stream starts playing, or the special cases with the various different types of widget that affect the walking and activation of the graph. Notice how much more information there is - though it wasn't _that_ coherent (rather than using "linked" it should've been "bound").

The point is, I've described that there are widgets, there's a binding of widgets together, the widgets are associated with a context, and finally that there are restrictions on what bindings can happen. I've probably missed out quite a bit too without really knowing it - because I don't know it yet either.

Yes, I do know about the Documentation/sound/alsa/soc/dapm.txt document.

> I mean I don't come to you either if I have a new ARM SoC that's not
> supported yet and demand that you implement support for it and exclaim
> that the ARM port sucks because it doesn't support that SoC yet.

If you come to me and ask about something in ARM, then you will most likely get something more than a few words explaining a big chunk of code - a bit like the oops report I dissected last night, for which I provided full reasoning of the conclusion that I came to (SDRAM failure / hardware fault on bit 8 of the SDRAM data bus).
On 09/01/2013 10:51 AM, Russell King - ARM Linux wrote:
> On Sun, Sep 01, 2013 at 09:42:29AM +0200, Lars-Peter Clausen wrote:
>> Let's try to wrap up the situation:
>>
>> * The hardware has one audio stream, but two DAIs, one for SPDIF and
>>   one for I2S. The same audio stream is sent to both DAIs at the same
>>   time (it is possible, though, to disable one or both of the DAIs).
>
> More or less. To be more clear: audio DMA commences when either one or
> both outputs are enabled, and stops when both are disabled.
>
> This is why either only one can be enabled, or, if both are to be
> enabled, both enable bits must be set in one register write, and when
> disabled, both bits must be cleared in one register write.
>
> So, let's say for argument's sake that you wanted to go from disabled
> to a single output, then to dual output, back to single output, and
> finally back to disabled. You would need this sequence:
>
> - enable single output
>   ...
> - disable single output
> - wait for audio unit to indicate not busy
> - enable both outputs
>   ...
> - disable both outputs
> - wait for audio unit to indicate not busy
> - enable single output
>   ...
> - disable single output
> - wait for audio unit to indicate not busy
>
>> * This is something new and not supported by classical ASoC.
>>
>> * DPCM has support for this, but DPCM is still new, unstable,
>>   under-documented and apparently has a couple of bugs.
>>
>> * With non-DPCM ASoC it is possible to have two DAIs if they are not
>>   used at the same time (which is what I recommend you implement
>>   first, before trying to get DPCM running).
>
> If you look at my other responses, you'll see that this is what I
> tried back in May, and I was unhappy about that solution because:
>
> 1. there is no guarantee that they couldn't be used at the same time.
> 2. this results in two entirely separate "CPU DAI"s, each with their
>    own independent sample rate/format settings, which, if they happen
>    to be used together, will result in fighting over the same
>    register(s).

I know, but you can make it policy that only one of them may be used at a time. Furthermore, you can add a check to the startup() callback to return an error if the other DAI is active.

> Moreover, this results in a completely different set of changes to the
> driver which are in an opposing direction to the DPCM approach.

I think the patch is actually going to be maybe 100 lines or so, and it gives you something to work with, and, unlike your current approach, it is not trying to work around the framework. Then you can add the other patches adding the SPDIF controls on top of it. Once that's done you can concentrate on trying to get DPCM running.

>> I still don't know if you actually need the feature of being able to
>> output the same audio signal to both DAIs - do you have such a board?
>
> This board has the SPDIF connected to a TOSlink and an HDMI
> transmitter. It also has the I2S connected only to the HDMI
> transmitter, though it's common at the moment to only use the SPDIF
> output to drive them both.

Ok, so things will work fine with the either/or approach for now.

[...]

>> But even then I still recommend to first get the non-DPCM either/or
>> approach implemented and once that's working try to get DPCM running.
>> Which probably involves fixing some of the DPCM issues in the core.
>> As I said sending the same audio streams to two DAIs is something new
>> and if there was no DPCM yet you'd need to add support for sending
>> the same stream to multiple DAIs. So either way you'd have to get
>> your hands dirty.
>
> Could you comment on the patch which creates the two front-end DAIs
> which I sent in a different sub-thread - the one which I indicated was
> from back in May? It isn't quite suitable for submission because the
> base it applies to has changed since then, but it should be sufficient
> to give an idea of the solution I was trying to implement there.

The patch looks OK, except for maybe the way the names for the DAIs are created. That's probably something that's better handled in the core in a generic way.

- Lars
On Sun, Sep 01, 2013 at 09:51:21AM +0100, Russell King - ARM Linux wrote:
> On Sun, Sep 01, 2013 at 09:42:29AM +0200, Lars-Peter Clausen wrote:
> > * With non-DPCM ASoC it is possible to have two DAIs if they are not
> > used at the same time (which is what I recommend you implement
> > first, before trying to get DPCM running).

> If you look at my other responses, you'll see that this is what I
> tried back in May, and I was unhappy about that solution because:

> 1. there is no guarantee that they couldn't be used at the same time.
> 2. this results in two entirely separate "CPU DAI"s, each with their
>    own independent sample rate/format settings, which if they happen
>    to be used together will result in fighting over the same
>    register(s).

These aren't issues for either/or; the whole point here is that the machine driver will be limited to only connecting one of the DAIs, so there will be no way for userspace to interact with the unused DAI. As Lars-Peter says, you can add explicit checks for this in the code if you are concerned about system integrators getting this wrong.

> Moreover, this results in a completely different set of changes to the
> driver which are in an opposing direction to the DPCM approach.

Like Lars-Peter says, it really, really shouldn't be - what moving to DPCM should be doing with this code is mostly moving the code around a bit to pull the bits that are shared into a front end DAI. The most substantial change should be handling the enables, but that shouldn't take much code at all; your current patch does it in 35 lines and I'd not expect it to be much different in a DPCM world.

> > And I'm sure people are willing to help you figure out the parts you
> > don't understand yet if you ask _nicely_.

> Can you then please explain why when I ask for help understanding DAPM
> in a "nice" way, the response I get is just "it's just a graph walk"
> and no further technical details?

Ask more detailed questions and engage in a discussion; the reason you keep on getting the same responses is that you tend to repeat the same requests a lot. Something like "I understand the big picture but I am trying to do X and need to know Y because Z" (with all of X, Y and Z being important) is usually a good approach.

The public interface for DAPM is that the widgets get created with sensible types and hooked up together, then powered up as needed; if something needs to know specifics of that process, like exactly when a given register will be written, that's a worrying implementation detail.

> | DAPM is a set of "widgets" representing various components of an
> | audio system. The widgets are linked together by a graph. Each
> | widget lives in a context - cpu, platform or codec. Some bindings
> | only happen within a context, others cross contexts (so linking the
> | CPU audio stream to the codec for example)

This last statement is not the case; you can see the route connecting code in snd_soc_dapm_add_route() - there is no case in which it will only try to match within a single context.

As I have said to you, without more information it is hard to tell what problems you are seeing when you are trying this. It could be a case of trying to create routes before the widgets are added, or it could be a case of trying to create inter-device links with widgets with globally duplicated names (though that would give wrong routes rather than no routes, so I suspect sequencing).

> > I mean I don't come to you either if I have a new ARM SoC that's not
> > supported yet and demand that you implement support for it and
> > exclaim that the ARM port sucks because it doesn't support that SoC
> > yet.

> If you come to me and ask about something in ARM, then you will most
> likely get something more than a few words explaining a big chunk of
> code - a bit like the oops report I dissected last night and provided
> full reasoning of the conclusion that I came to (SDRAM failure /
> hardware fault on bit 8 of the SDRAM data bus.)

The bit about the way in which support is requested is important here - it really can make a big difference how one asks.
On Sun, Sep 01, 2013 at 12:51:16PM +0100, Mark Brown wrote:
> Like Lars-Peter says it really, really shouldn't be - what moving to
> DPCM should be doing with this code is mostly moving the code around a
> bit to pull the bits that are shared into a front end DAI. The most
> substantial change should be handling the enables but that shouldn't
> take much code at all, your current patch does it in 35 lines and I'd
> not expect it to be much different in a DPCM world.

The delta between the DPCM version and the dual-DAI version is over 300 lines of change - the approaches employed by these two versions are completely different.

Maybe you could look at the patch and suggest where I'm going wrong?

> On Sun, Sep 01, 2013 at 09:51:21AM +0100, Russell King - ARM Linux wrote:
> > Can you then please explain why when I ask for help understanding
> > DAPM in a "nice" way, the response I get is just "it's just a graph
> > walk" and no further technical details?
>
> Ask more detailed questions and engage in a discussion; the reason you
> keep on getting the same responses is that you tend to repeat the same
> requests a lot. Something like "I understand the big picture but I am
> trying to do X and need to know Y because Z" (with all of X, Y and Z
> being important) is usually a good approach.

When you don't even understand the "big picture", it's hard to know where to start. That's why starting off with a simple question is the right thing to do - you expect to get a simple but informative response, which helps you to frame more specific questions later. If you start from a position of knowing very little, it's exceedingly difficult to ask specific questions.

> The public interface for DAPM is that the widgets get created with
> sensible types and hooked up together then powered up as needed, if
> something needs to know specifics of that process like exactly when a
> given register will be written that's a worrying implementation
> detail.
>
> > | DAPM is a set of "widgets" representing various components of an
> > | audio system. The widgets are linked together by a graph. Each
> > | widget lives in a context - cpu, platform or codec. Some bindings
> > | only happen within a context, others cross contexts (so linking
> > | the CPU audio stream to the codec for example)
>
> This last statement is not the case; you can see the route connecting
> code in snd_soc_dapm_add_route() - there is no case in which it will
> only try to match within a single context.
>
> As I have said to you without more information it is hard to tell what
> problems you are seeing when you are trying this. It could be a case
> of trying to create routes before the widgets are added, or it could
> be a case of trying to create inter device links with widgets with
> globally duplicated names (though that would give wrong routes rather
> than no routes so I suspect sequencing).

Right, so, I have a widget with a stream name, and the stream name matches a CPU DAI stream.

If I register this widget against the platform DAPM context, then there is no connection between this widget and the CPU DAI streams. (Bear in mind that at the time I tried this, I had disabled the creation of the stream widgets that were overwriting the platform stream widgets - because you were not providing any useful information to work around that problem.)

If I register this widget against the CPU DAI DAPM context, the stream name is matched to the CPU DAI streams, and a connection is made between the stream widgets and my widget.

This is what I mean by "some bindings only happen within a context".
On Sun, Sep 01, 2013 at 01:15:18PM +0100, Russell King - ARM Linux wrote:
> On Sun, Sep 01, 2013 at 12:51:16PM +0100, Mark Brown wrote:
> > Like Lars-Peter says it really, really shouldn't be - what moving to
> > DPCM should be doing with this code is mostly moving the code around
> > a bit to pull the bits that are shared into a front end DAI. The
> > most substantial change should be handling the enables but that
> > shouldn't take much code at all, your current patch does it in 35
> > lines and I'd not expect it to be much different in a DPCM world.

> The delta between the DPCM version and the dual-DAI version is over
> 300 lines of change - the approaches employed by these two versions
> are completely different.

> Maybe you could look at the patch and suggest where I'm going wrong?

The diff doesn't look that worrying to me - a large chunk of it is about moving code around to register multiple DAIs, which is something that the final DPCM code is going to want to do anyway. A lot of the changes in the hw_params() function would probably stay the same too, though the function would move to the front end DAI I expect.

> > Ask more detailed questions and engage in a discussion; the reason
> > you keep on getting the same responses is that you tend to repeat
> > the same requests a lot. Something like "I understand the big
> > picture but I am trying to do X and need to know Y because Z" (with
> > all of X, Y and Z being important) is usually a good approach.

> When you don't even understand the "big picture", it's hard to know
> where to start. That's why starting off with a simple question is
> the right thing to do - and you expect to get a simple but informative
> response, so that helps you to frame more specific questions later.
> If you start from a position of knowing very little, it's exceedingly
> difficult to ask specific questions.

So say that - explain that you find it hard to relate the answer to what you're trying to accomplish, doing things like providing more detail on the problem you're trying to solve and highlighting anything you've noticed is unusual, to try to help people understand what might be missing from their answers.

> Right, so, I have a widget with a stream name, and the stream name
> matches a CPU DAI stream.

Ah, this is stream names, not regular routes - new code should just use a normal DAPM route to connect to the AIF widgets. Stream names are being phased out now that we can just use DAPM routes (which means that we don't need to go through every single widget doing string checks every time we start or stop a stream).

> If I register this widget against the platform DAPM context, then
> there is no connection between this widget and the CPU DAI streams.
> (Bear in mind that at the time I tried this, I had disabled the
> creation of the stream widgets that were overwriting the platform
> stream widgets - because you were not providing any useful information
> to work around that problem.)

> If I register this widget against the CPU DAI DAPM context, the stream
> name is matched to the CPU DAI streams, and a connection is made
> between the stream widgets and my widget.

> This is what I mean by "some bindings only happen within a context".

This is happening because you're doing things based on the stream name - stream names aren't really DAPM routes, they're magic things which only work within the same DAPM context. Previously we used to go through and do the string compare to look up widgets outside of DAPM at runtime on a per-device basis (though that made little difference, since the only devices supported were CODECs, which are always a single device and context). The current implementation replicates this functionality exactly using DAPM routes, and the long term plan is that no stream names will be specified in widgets.

If you omit the stream name and add the routes via a routing table you should at least get some error messages.
diff --git a/sound/soc/codecs/spdif_transciever.c b/sound/soc/codecs/spdif_transciever.c
index 553708d..fde0150 100644
--- a/sound/soc/codecs/spdif_transciever.c
+++ b/sound/soc/codecs/spdif_transciever.c
@@ -34,7 +34,7 @@ static const struct snd_soc_dapm_widget dit_widgets[] = {
 };
 
 static const struct snd_soc_dapm_route dit_routes[] = {
-	{ "spdif-out", NULL, "Playback" },
+	{ "spdif-out", NULL, "spdif-Playback" },
 };
 
 static struct snd_soc_codec_driver soc_codec_spdif_dit = {
@@ -47,7 +47,7 @@ static struct snd_soc_codec_driver soc_codec_spdif_dit = {
 static struct snd_soc_dai_driver dit_stub_dai = {
 	.name		= "dit-hifi",
 	.playback	= {
-		.stream_name	= "Playback",
+		.stream_name	= "spdif-Playback",
 		.channels_min	= 1,
 		.channels_max	= 384,
 		.rates		= STUB_RATES,
diff --git a/sound/soc/kirkwood/kirkwood-spdif.c b/sound/soc/kirkwood/kirkwood-spdif.c
index e5257ec..d1ee8f7 100644
--- a/sound/soc/kirkwood/kirkwood-spdif.c
+++ b/sound/soc/kirkwood/kirkwood-spdif.c
@@ -21,7 +21,7 @@ static struct snd_soc_ops kirkwood_spdif_ops = {
 #include <sound/soc.h>
 
 static const struct snd_soc_dapm_route routes[] = {
-	{ "Playback", NULL, "spdifdo" },
+	{ "spdif-Playback", NULL, "spdifdo" },
 };
 
 static struct snd_soc_dai_link kirkwood_spdif_dai0[] = {
@@ -38,9 +38,18 @@
 static struct snd_soc_dai_link kirkwood_spdif_dai1[] = {
 	{
 		.name = "S/PDIF1",
-		.stream_name = "IEC958 Playback",
+		.stream_name = "Audio Playback",
 		.platform_name = "mvebu-audio.1",
 		.cpu_dai_name = "mvebu-audio.1",
+		.codec_name = "snd-soc-dummy",
+		.codec_dai_name = "snd-soc-dummy-dai",
+		.dynamic = 1,
+	}, {
+		.name = "Codec",
+		.stream_name = "IEC958 Playback",
+		.cpu_dai_name = "snd-soc-dummy-dai",
+		.platform_name = "snd-soc-dummy",
+		.no_pcm = 1,
 		.codec_dai_name = "dit-hifi",
 		.codec_name = "spdif-dit",
 	},
@@ -70,7 +79,7 @@ static int kirkwood_spdif_probe(struct platform_device *pdev)
 		card->dai_link = kirkwood_spdif_dai0;
 	else
 		card->dai_link = kirkwood_spdif_dai1;
-	card->num_links = 1;
+	card->num_links = 2;
 	card->dapm_routes = routes;
 	card->num_dapm_routes = ARRAY_SIZE(routes);
 	card->dev = &pdev->dev;
diff --git a/sound/soc/soc-pcm.c b/sound/soc/soc-pcm.c
index ccb6be4..381df28 100644
--- a/sound/soc/soc-pcm.c
+++ b/sound/soc/soc-pcm.c
@@ -1634,8 +1634,8 @@ static int dpcm_fe_dai_prepare(struct snd_pcm_substream *substream)
 
 	/* there is no point preparing this FE if there are no BEs */
 	if (list_empty(&fe->dpcm[stream].be_clients)) {
-		dev_err(fe->dev, "ASoC: no backend DAIs enabled for %s\n",
-			fe->dai_link->name);
+//		dev_err(fe->dev, "ASoC: no backend DAIs enabled for %s\n",
+//			fe->dai_link->name);
 		ret = -EINVAL;
 		goto out;
 	}