Want to “see” music?
I’m not exactly sure what I ended up with. What matters is that this “thing” shows you the music.
#Music #Visualisation. A “breathing” canvas. A Reactive Music Canvas. Call it what you want. The nice thing about it is that I coded it in a couple of hours, and I love the result and how it “breathes” the music. Hope you like it too!
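The core idea behind a canvas like this is simple: measure how loud the audio is right now and drive a visual property (size, colour, opacity) with that loudness. Here is a toy sketch of that mapping — not the canvas’s actual code; the function names, the RMS loudness measure, and the `base`/`gain` parameters are all my own assumptions:

```python
import math

def rms(frame):
    """Root-mean-square amplitude of one audio frame (a list of samples)."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def breathing_scale(frame, base=1.0, gain=0.8):
    """Map a frame's loudness to a visual scale factor:
    louder frames make the canvas element 'inhale' (grow)."""
    return base + gain * rms(frame)

# Synthetic audio: a quiet and a loud frame of a 440 Hz sine at 44.1 kHz.
quiet = [0.1 * math.sin(2 * math.pi * 440 * t / 44100) for t in range(1024)]
loud  = [0.9 * math.sin(2 * math.pi * 440 * t / 44100) for t in range(1024)]

assert breathing_scale(loud) > breathing_scale(quiet)
```

In a real visualiser you would smooth the scale factor across frames (e.g. with an exponential moving average) so the “breathing” looks organic rather than jittery.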
Hi again! After a while!
I’m very happy to introduce my latest game on Android, #SyncSeven.
It’s a procedurally generated music game, about enchanting your visual perception, your feelings, and your musical taste! Just sync your taps with the turns generated from the peaks in the music. The more rapid the music, the faster you have to be: that’s the black-and-white Burst Mode! It’s a bit hard at first, but the more you play, the more you’ll feel the music, and the more you’ll enjoy it.
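Generating turns from the peaks in the music can be sketched as a simple onset detector: compare each audio frame’s energy against a short moving-average history, and mark frames that spike above it as “turns” the player must tap on. Faster music packs the peaks closer together, so taps naturally speed up. This is only an illustrative sketch, not #SyncSeven’s actual implementation; the function name and the threshold scheme are my own assumptions:

```python
def detect_peaks(energies, threshold_ratio=1.3, history=8):
    """Return indices of frames whose energy spikes above a
    moving-average threshold -- each index becomes one 'turn'."""
    peaks, window = [], []
    for i, e in enumerate(energies):
        avg = sum(window) / len(window) if window else 0.0
        if window and e > threshold_ratio * avg:
            peaks.append(i)
        window.append(e)
        if len(window) > history:  # keep only a short recent history
            window.pop(0)
    return peaks

# Two energy spikes among quiet frames -> two turns.
print(detect_peaks([1, 1, 1, 5, 1, 1, 1, 6, 1]))  # → [3, 7]
```

Multiplying each peak index by the frame duration gives the turn’s timestamp on the game’s timeline.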
Now FREE on Google Play here, and there’s a trailer here.
Enjoy! Feedback welcomed!
This was the seminar I gave on Artificial Neural Networks (ANN) back in 2011 at F.I.T.E Damascus, Syria. It tackled the problem of generating game content based on player preferences. I discussed two papers in the seminar:
- Noor Shaker, Georgios N. Yannakakis, and Julian Togelius. Towards Automatic Personalized Content Generation for Platform Games.
- Noor Shaker, Georgios N. Yannakakis, and Julian Togelius. Feature Analysis for Modeling Game Content Quality.
Interestingly, Noor Shaker (one of the authors of the two papers above) has been supervising me, since 2011 and ongoing, on a new research project that investigates generating content for more immersive game experiences: Generating Adaptive Content for First-Person Shooter Games. I have also worked with her on Utilising Visual Features as Indicators of Players Engagement in Super Mario Bros.
Plenty of interesting work in this domain can be found in the published papers of these authors: Noor Shaker, Georgios N. Yannakakis, and Julian Togelius.