MODULE: TotalLipSync v0.5

Started by Snarky, Tue 18/04/2017 19:22:17


eri0o

#60
@Snarky uhm, would getting the time of the voice clip in milliseconds be better? Because I think you can get it from its audio channel.

I saw this in the manual,

Quote
One channel is always reserved for the speech voice-over and you cannot change that, so you have 15 at your disposal.

I wonder if this means the voice clip's AudioChannel is fixed. The other approach would be to ask the module user to pass in this audio channel and let them figure it out.

Snarky

#61
Quote from: eri0o on Sun 01/10/2023 10:31:35
Because I think you can get it from its audio channel.

Yes, that could work (voice clips always play on channel 0), though I've never quite trusted AudioChannel.PositionMs (a long time ago I had some trouble when it didn't seem to work properly).

I do also want to support syncing without an audio file, for more general-purpose synced/scripted animations, but the frame counter should be fine for that purpose.
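
For what it's worth, a minimal sketch of that approach (not module code; it assumes the reserved speech channel is index 0, as the manual quote above suggests, and simply polls AudioChannel.PositionMs):

Code: ags
  // Sketch only: poll the reserved speech channel (index 0) and read how far
  // into the current voice clip we are, in milliseconds.
  function repeatedly_execute_always()
  {
    AudioChannel* speech = System.AudioChannels[0];
    if (speech != null && speech.IsPlaying)
    {
      int posMs = speech.PositionMs;  // elapsed playback time of the voice clip
      // ...look up which phoneme frame covers posMs in the sync data...
    }
  }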

Dave Gilbert

#62
Quote from: Snarky on Sun 01/10/2023 07:04:49
Quote from: Dave Gilbert on Sat 30/09/2023 23:36:53
So I'm wondering if the problem lies with the Rhubarb end, or if my changing the code like this messed things up. For example, what was the reason that frame 7 was originally unassigned? Was it essential to keep that?

Hmm, changing the mapping should be fine (but see below), though I would strongly recommend that instead of changing the AutoMapPhonemes implementation in the module code, you set up a manual mapping outside of the module when you initialize it—so, instead of calling AutoMapPhonemes, you use those exact AddPhonemeMapping calls yourself to set it up the way you want it.

Ok, that I can do. Thanks! Just to be sure, I did this in game_start():

Code: ags
  TotalLipSync.Init(eLipSyncRhubarb);    // Or whatever lip sync format you're using
  //TotalLipSync.AutoMapPhonemes(); <--- commented out!
  TotalLipSync.AddPhonemeMapping("X",8);  // rest (mouth closed)
  TotalLipSync.AddPhonemeMapping("A",0);  // mbp
  TotalLipSync.AddPhonemeMapping("B",1);  // other consonants
  TotalLipSync.AddPhonemeMapping("C",2);  // EH/AH/EY etc. (bed, hut, bait)
  TotalLipSync.AddPhonemeMapping("D",3);  // AA/AE/AY (father, bat, like)
  TotalLipSync.AddPhonemeMapping("E",4);  // AO/OW (thaw, slow)
  TotalLipSync.AddPhonemeMapping("F",5);  // UW/OY/UH/OW (you, toy, poor)
  TotalLipSync.AddPhonemeMapping("G",6);  // F/V (fine, very)
  TotalLipSync.AddPhonemeMapping("H",7);  // L (letter)	

Quote
The reason why the auto-setup is arranged the way it is (skipping frame 7) is because the other lipsync data formats, Moho and Pamela/Annosoft, distinguish between "ooh" sounds and "w" sounds, but Rhubarb uses the same phoneme/mouth shape for both. The auto-setup maps the frames the same way for all the formats, allowing you to set up the animation view once, and then use whichever lipsync data format you like. You can even switch formats along the way, if for example you want to hand-sync some scenes in Pamela format and do others automatically in Rhubarb (though you'd need to reset the module with the new format and mappings whenever you switch). I don't remember precisely why it's frame 7 that is the "optional" one rather than the last frame, but I would guess I based the order on some standard or convention for lipsync setups.
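
(A rough sketch of that format switch, for reference; the Pamela enum name below is a guess, so check the module header for the exact value:)

Code: ags
  // Rough sketch: re-initialize the module whenever you switch sync-data
  // formats, then set up the phoneme mappings again for the new format.
  // eLipSyncPamela is a placeholder name; use the actual enum value from
  // the TotalLipSync header.
  TotalLipSync.Init(eLipSyncPamela);
  TotalLipSync.AutoMapPhonemes();
  // ...scenes hand-synced in Pamela format...

  TotalLipSync.Init(eLipSyncRhubarb);
  TotalLipSync.AutoMapPhonemes();
  // ...scenes synced automatically with Rhubarb...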

Okay. So since I never use anything aside from Rhubarb, I don't need to worry about that? I can use my version of the code?

Quote
Anyway, to get to the point of your question: looking over the code, it does assume that Frame 0 is the non-speaking frame, as in the AGS convention for NormalView and SpeechView; this frame is set when the animation file does not specify a frame. (This is necessary because some of the data formats can have gaps, but I don't remember if this applies to Rhubarb.) You could try to change it here:

Aha! Yes I have noticed that when the characters stop speaking they always end up on the wrong mouth shape. I added some code to manually change the frame when they stop speaking, but your solution is better! Thanks!

EDIT: Hmm. I notice that the mouth shape still finishes on the wrong frame. That's on me (because of the way the mouth shape frames are imported), so if there's no way to fix that, I'll just continue using the workaround code I created.
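
(For anyone curious, a workaround along those lines might look roughly like this; cHero and the clip number are placeholders, and it assumes frame 8 is the rest frame, as in the mapping above:)

Code: ags
  // Rough sketch of a post-line cleanup, not a fix inside the module:
  // after the synced line returns, briefly force the rest frame (8 = "X"
  // in the mapping above) so the character doesn't linger on the last
  // phoneme frame, then release the view again.
  cHero.SaySync("&12 See you around.");
  cHero.LockViewFrame(cHero.SpeechView, cHero.Loop, 8);
  Wait(1);
  cHero.UnlockView();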

Akril15

#63
Sorry for resurrecting this thread, but I'm having trouble getting this module to work. The voice line plays in the game and I've got the corresponding .dat file in Compiled/Windows/sync, but for some reason the mouth animation doesn't match what I exported from Papagayo at all. It's a 4-second line, but the character's mouth just opens for one frame and remains closed for the rest of the line's duration. I tried remaking the .dat file, but there was little change. What's going on here?

EDIT: I've manually assigned phonemes to the frames (skipping frame 7), but I'm only using 8 of the 9 phonemes in AGS's Lip Sync panel. I'm using text-reading lip sync elsewhere in the game, but I disabled it (using OPT_LIPSYNCTEXT) when I was testing this module.


Looks like I was mistaken about where to put the .dat file. I placed it in the Speech folder; now I'm just dealing with some "No frame found to match phoneme" problems.

EDIT 2: I still can't get this module to work. No matter where I place the .dat file, the character's line plays, but the mouth doesn't move. I've tried placing the .dat file (exported from Papagayo) in Compiled/sync, I've tried placing it in the Speech folder, and nothing changes. What am I doing wrong? (Also, I should reiterate that while I am using AGS's text-reading lip sync elsewhere in the game, I'm disabling it with OPT_LIPSYNCTEXT before running a SaySync line.)
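
(For completeness, the calling pattern described here is roughly the following; cGuy and the clip number are placeholders:)

Code: ags
  // Placeholder sketch of the setup described above: turn off AGS's
  // text-based lip sync, then play the line through the module so it
  // uses the external sync data instead.
  SetGameOption(OPT_LIPSYNCTEXT, 0);
  cGuy.SaySync("&3 This line should use the Papagayo sync data.");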
